Apache Superset
Acho can help you import data from spreadsheets or apps and host a database on our server, so you can export any type of data to Apache Superset. The process includes three steps:
  1. Generate database credentials in Acho. Each Acho account generates only one set of credentials.
  2. Use the credentials generated from Step 1 to configure the connection in Apache Superset. This configuration only needs to be set up the first time.
  3. Whenever you export a new table from Acho, complete this step to add the table in Apache Superset.

Step 1: Generate database credentials

1. Click the export button on the top right of the table.
2. Select Apache Superset and click Generate Credentials.
3. Acho automatically generates a Dataset ID and a JSON key file.
Notice that:
  • Each Acho account has a unique set of credentials for Apache Superset.
  • Every time you export data to Apache Superset, you can find these credentials. However, you only need them the first time, to build the connection in Apache Superset.
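The JSON key file is typically shaped like a standard Google Cloud service-account key. A sketch of its structure is below; every value is a placeholder, not a real credential, and the exact fields Acho includes may differ:

```json
{
  "type": "service_account",
  "project_id": "your-project-id",
  "private_key_id": "...",
  "private_key": "-----BEGIN PRIVATE KEY-----\n...\n-----END PRIVATE KEY-----\n",
  "client_email": "exporter@your-project-id.iam.gserviceaccount.com",
  "client_id": "...",
  "token_uri": "https://oauth2.googleapis.com/token"
}
```

Keep this file private; anyone holding it can read the exported data.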

Step 2: Set up a connection in Apache Superset

Superset requires a Python DB-API database driver and a SQLAlchemy dialect to be installed for BigQuery.

2-1. Install BigQuery driver

1. Create requirements-local.txt:
# From the repo root...
touch ./docker/requirements-local.txt
2. Add the driver selected in the step above:
echo "pybigquery" >> ./docker/requirements-local.txt
3. Rebuild your local image with the new driver baked in:
docker-compose build --force-rm
4. After the rebuild of the Docker images is complete (which may take a few minutes), you can relaunch Superset with:
docker-compose up
Alternatively, you can start Superset via Docker Compose using the recipe in docker-compose-non-dev.yml, which uses pre-built frontend assets and skips the frontend build:
docker-compose -f docker-compose-non-dev.yml up

2-2. Add a database

1. Open Apache Superset and navigate to Data > Databases. Then, click the + DATABASE button.
2. Choose Google BigQuery from the database dropdown.
3. Upload the JSON key file generated from Step 1 and click CONNECT.
4. Once the connection is set up, click FINISH.
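If you prefer to configure the database with a SQLAlchemy URI instead of the upload form, the pybigquery dialect accepts URIs of the form bigquery://<project>/<dataset>. A minimal sketch of building one; both identifiers below are hypothetical placeholders, not values from your account:

```python
# Build a BigQuery SQLAlchemy URI for Superset's SQLALCHEMY URI field.
# project_id and dataset_id are placeholders (assumptions), not real values.
project_id = "your-project-id"
dataset_id = "apache_supset_dataset_XXX"  # the schema name used in Step 3

uri = f"bigquery://{project_id}/{dataset_id}"
print(uri)  # → bigquery://your-project-id/apache_supset_dataset_XXX
```

With this route you still need the JSON key file; Superset's database form lets you paste or upload it alongside the URI.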

Step 3: Get the datasets exported from Acho

1. Navigate to Data > Datasets. Then, click the + DATASET button.
2. Specify the fields below:
  • DATABASE: Choose the BigQuery database that you just connected in Step 2.
  • SCHEMA: Select apache_supset_dataset_XXX (XXX is your unique account ID on Acho).
  • SEE TABLE SCHEMA: Select a table that you want to import. You can see a list of tables that you already exported from Acho here.