Beautiful dashboards, clean data visualization, and live data streams: all features of Metabase, the open source data dashboard platform.
However, Firebase Firestore, a popular database choice, isn't exactly Metabase-friendly out of the box. In this article, I'll explain how to funnel Firebase Firestore data into Metabase, all while running in the cloud on a Repl.
I'm going to assume you already have Firebase Firestore set up, but if you don't, start here. Make sure you're on the Blaze plan.
Funneling Data from Firebase to Google BigQuery
Because Metabase runs on SQL queries, we've got to convert our document-based database (Firestore) into a SQL-table-based one (Google BigQuery). Thankfully, the team at Firebase has created an extension that automatically tracks documents within a collection and streams them to a Google BigQuery database. We'll start by setting up this extension.
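To make the conversion concrete, here's a rough sketch (in Python, with approximated column names, since the extension's actual changelog schema may differ) of how a single Firestore document becomes a row in the BigQuery changelog table:

```python
import json

# A hypothetical document from a Firestore "users" collection.
user_doc = {"name": "Ada", "plan": "pro", "signed_up": "2022-01-15"}

# Roughly the shape of the row the extension streams into BigQuery:
# one row per document event, with the entire document serialized
# into a single JSON string column. Column names are approximations.
changelog_row = {
    "document_name": "projects/my-project/databases/(default)/documents/users/ada",
    "operation": "CREATE",
    "timestamp": "2022-01-15T00:00:01Z",
    "data": json.dumps(user_doc),  # the whole document, as one JSON string
}

print(changelog_row["data"])  # {"name": "Ada", "plan": "pro", "signed_up": "2022-01-15"}
```

Note that every property of the document ends up inside that one "data" string, which is exactly the problem we'll run into below.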
Navigate to the Google BigQuery Firebase Extension and then click on "Install in Console."
Select the project you'd like to add the extension to. For me, I'll be selecting polyglotprod.
In the configuration settings of the extension, change the "Collection path" & "Table ID" to the Firestore collection you'd like to listen to. Be aware that only documents in the specified collection will be monitored; if you'd like to listen to sub-collections, you'll need to configure the extension again for each one. I'll be monitoring the users collection so that I can see my user metrics in Metabase. Click "Install Extension" once you are done.
Once the extension is done installing, add a test document in Firebase Firestore so we can check that the extension worked.
Now let's check our Google BigQuery table to make sure everything worked. In the Firebase Extensions dashboard, click the "Manage" button.
Navigate to the "How this extension works" page and follow steps 4-6, which should confirm that you can see your test document in the Google BigQuery Table.
Our Firebase data is now in Google BigQuery, but unfortunately, all of the data is squished into a single column labeled "Data," which Metabase can't parse.
With our data now funneled to BigQuery, there are two options for making it Metabase-friendly:

1. A man-in-the-middle HTTP function that receives the Firestore data and transforms it before sending it off to BigQuery. Unfortunately, this approach doesn't work: BigQuery expects the form of the HTTP request & response to be identical, so changing the SQL column labels is impossible (it took three hours of Deno deployments to get that through my head).

2. A BigQuery database schema that maps the values of our current database to a new, Metabase-friendly database view. That's what we want.
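To illustrate what option 2 does: the generated view essentially pulls each mapped document property out of the JSON "data" column into its own SQL column. Here's a minimal Python sketch of that idea (field and column names are hypothetical, and in reality BigQuery does this with SQL JSON functions, not Python):

```python
import json

# A raw changelog row as the extension writes it: the whole Firestore
# document is squished into one JSON string column (names approximate).
raw_row = {
    "document_name": "projects/my-project/databases/(default)/documents/users/ada",
    "data": json.dumps({"name": "Ada", "plan": "pro"}),
}

# The fields your schema file maps to SQL columns (hypothetical list).
schema_fields = ["name", "plan"]

# What the generated view does in spirit: one real column per field,
# so Metabase can filter, group, and chart on each property directly.
doc = json.loads(raw_row["data"])
flat_row = {field: doc.get(field) for field in schema_fields}

print(flat_row)  # {'name': 'Ada', 'plan': 'pro'}
```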
Thankfully, the process isn't too complicated. You'll need to create a JSON map (or "schema") of your Firestore document properties to SQL columns. Fork the Repl below and follow the instructions in the Readme!
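For reference, the schema file is roughly shaped like this. The field names below belong to my hypothetical users collection; check the Readme for the exact format and the full list of supported types:

```json
{
  "fields": [
    { "name": "name", "type": "string" },
    { "name": "plan", "type": "string" },
    { "name": "signed_up", "type": "timestamp" }
  ]
}
```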
Import Existing Documents into BigQuery
Now that our database schemas are set up correctly, all new & updated documents will be correctly added to the BigQuery database.
However, what if we want to import all the documents that already exist in Firestore? Time for another Repl & handy script. Once again, the instructions are in the Readme!
Setting up Metabase
To host your own Metabase instance, fork the Repl below and hit "Run." It may take up to three minutes for Metabase to initialize.
(Pro-tip: if you're on the Replit Hacker or Teams Pro plan, you can increase the RAM limits for a smoother experience; check out the Readme!)
The Power of the Cloud ☁️
Once you've reached step 3 of the Metabase setup, you'll need to connect your Google BigQuery database to Metabase. Click on the "Google BigQuery" option.
Choose any display name you like, and for the service account JSON file, we'll need to go fishing 🐟 in the Google IAM panel.
Head back to the Google Cloud BigQuery console, and in the search bar, type in "Service Accounts" and select "Service Accounts."
Click the "+ Create Service Account" button and replicate the details below. We'll be giving Metabase "BigQuery Admin" access.
Click "Done." You should be deposited back at the Service Accounts dashboard, where you'll now need to click the three dots at the far right of the Metabase service account's row.
Select "Manage keys," then click "ADD KEY" and "Create new key." Make sure the JSON option is selected, and click "CREATE." Save the JSON file somewhere safe. Now let's head back to Metabase: click "Select a file" under the "Service account JSON file" section and upload your JSON file.
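If you want to sanity-check that you grabbed the right file, a Google Cloud service account key is a JSON file shaped roughly like this (all values below are placeholders):

```json
{
  "type": "service_account",
  "project_id": "your-project-id",
  "private_key_id": "…",
  "private_key": "-----BEGIN PRIVATE KEY-----\n…\n-----END PRIVATE KEY-----\n",
  "client_email": "metabase@your-project-id.iam.gserviceaccount.com",
  "client_id": "…",
  "token_uri": "https://oauth2.googleapis.com/token"
}
```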
Click "Next," and start exploring your data to your heart's desire!
Bonus: Deploy your Dashboard to a Custom Domain
Click on the custom domain icon (a pencil) to the right of your domain name.
Click "create a domain link" and type in your domain name. Then navigate to the domain management settings of your domain registrar of choice and add the appropriate records.
And voilà! Your Metabase dashboard is now accessible at a custom domain!