Apache Superset

Acho can help you import data from spreadsheets or apps and host a database on our server, so you can export any type of data to Apache Superset. The process includes three steps:

  1. Generate database credentials

    Each Acho account generates only one set of credentials.

  2. Set up a connection in Apache Superset

    Use the credentials generated in Step 1 to configure the connection in Apache Superset. This configuration only needs to be done once.

  3. Update tables in Apache Superset

    Whenever you export a new table from Acho, complete this step to add the table to Apache Superset.

Generate database credentials

1. Click the export button on the top right of the table.

2. Select Apache Superset and click Export in Workflow.

3. In Workflow, drag Apache Superset from the right sidebar to the canvas. Then, link it to the table that you want to export.

4. Acho automatically generates a Dataset ID and a JSON key file. Use these credentials to set up the connection in Apache Superset.

Note:

  • Each Acho account has a unique set of credentials for Apache Superset.

  • The credentials are displayed every time you export data to Apache Superset, but you only need them the first time you build the connection in Apache Superset.

Set up a connection in Apache Superset

Superset requires a Python DB-API database driver and a SQLAlchemy dialect to be installed for BigQuery.
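
If you run Superset outside of Docker, you can install the driver directly into the Python environment that runs Superset; the Docker-based steps in 2-1 below cover the containerized setup. A minimal sketch, assuming the same pybigquery package used in those steps:

# For non-Docker installs, run this in the environment that runs Superset
pip install pybigquery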

2-1. Install BigQuery driver

1. Create requirements-local.txt

# From the repo root...
touch ./docker/requirements-local.txt

2. Add the BigQuery driver to the file:

echo "pybigquery" >> ./docker/requirements-local.txt

3. Rebuild your local image with the new driver baked in:

docker-compose build --force-rm

4. After the rebuild of the Docker images is complete (which may take a few minutes), you can relaunch using the following command:

docker-compose up
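
To watch the startup progress, you can follow the logs of the main container. The superset service name is an assumption based on the standard docker-compose.yml in the Superset repository:

docker-compose logs -f superset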

Alternatively, you can start Superset via Docker Compose using the recipe in docker-compose-non-dev.yml, which uses pre-built frontend assets and skips the front-end build step:

docker-compose -f docker-compose-non-dev.yml up
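
Once the containers are running, Superset should be reachable in your browser at http://localhost:8088, assuming the default port from the standard Docker Compose setup. A quick check from the command line:

# Returns OK when the Superset web server is up
curl -s http://localhost:8088/health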

2-2. Add a database

1. Open your Apache Superset instance and navigate to Data > Databases. Then, click the + DATABASE button.

2. Choose Google BigQuery from the database dropdown.

3. Upload the JSON key file generated from Step 1 and click CONNECT.

4. Once the connection is set up, click FINISH.
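
If your Superset version does not offer the JSON key file upload flow, you can usually connect with a SQLAlchemy URI instead. This is a sketch, not a required step: your-project-id is a placeholder for the project_id field inside the JSON key file, and the key contents are pasted into the connection's credentials / Secure Extra field.

bigquery://your-project-id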

Update tables in Apache Superset

1. Navigate to Data > Datasets. Then, click the + DATASET button.

2. Specify the fields below:

  • DATABASE: Choose the BigQuery database that you just connected in Step 2.

  • SCHEMA: Select apache_supset_dataset_XXX, where XXX is your unique account ID on Acho.

  • SEE TABLE SCHEMA: Select the table that you want to import. The list shows the tables that you have already exported from Acho.
