Data Applications

This documentation only applies to Snowplow BDP. See the feature comparison page for more information about the different Snowplow offerings.

Data applications are self-service analytics tools, deployed in your cloud, that help customers extract value from their data quickly by providing templated use-cases for data collection, modeling, and activation. They aim to reduce the technical barrier, making data analysis more accessible beyond just SQL users.

[Diagram: pipeline showing data flowing from tracked events into data apps]

Available Data Apps

Accessing the Applications

You can find the Data Applications in the Applications tab in the left sidebar of your Snowplow Console. If the status is not Live, you can click on the tile and request access. A Snowplow Customer Success Manager will then get in contact with you to discuss the Snowplow Digital Analytics package.

Once the application is installed, clicking on the tile will launch the application in a separate browser tab. By default, anyone in your Console organization will be able to access data applications.

If you wish to invite others to use data applications without giving them access to the rest of Console, you can create a new user and assign them the Data applications user role. That user will then only see the Data Applications tab within Console. These permissions can be managed in the usual way.

General Usage

Is the app running?

When the app is doing some calculations, querying the database, or otherwise still loading, you'll see the following gif in the top right of the app. You may particularly notice this on applications with multiple tabs per page, as the tabs will load in order so the last tab may seem empty until this processing is completed.

[Animation: stick people running, swimming, etc.]


Where an app has prerequisites, it will also have a Settings page that validates what is available to the app and provides steps to take for any unfulfilled requirements.

Chart Sources

Many of our apps support exporting the SQL used to generate the charts. Some apps have a specific button for this; in most cases, simply look for the icon and click it to download the SQL used to make that chart.


Note that some data is processed further after the query to get it into the format required for plotting; this may include actions such as filtering and pivoting.
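As an illustration (not the apps' actual code), this kind of post-query shaping, filtering rows and then pivoting long-format query results into the wide format a plotting library expects, might look like the following sketch. The column names and values are hypothetical:

```python
from collections import defaultdict

# Example rows as a chart query might return them: one row per
# (date, channel) pair. These values are illustrative only.
rows = [
    {"date": "2024-01-01", "channel": "Direct", "sessions": 120},
    {"date": "2024-01-01", "channel": "Search", "sessions": 340},
    {"date": "2024-01-02", "channel": "Direct", "sessions": 95},
    {"date": "2024-01-02", "channel": "Search", "sessions": 310},
]

def pivot(rows, index, columns, values):
    """Pivot long-format rows into one dict per index value,
    with one key per distinct column value."""
    out = defaultdict(dict)
    for row in rows:
        out[row[index]][row[columns]] = row[values]
    return dict(out)

# Filter first, then pivot -- the kind of post-processing that can
# happen between the exported SQL and the chart you see on screen.
filtered = [r for r in rows if r["sessions"] >= 100]
wide = pivot(filtered, "date", "channel", "sessions")
```

Because of steps like these, the exported SQL alone may not reproduce a chart exactly.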


Our apps provide useful help text throughout; keep an eye out for the help icon () for more context or help in using some functionality.

Log Out

If you wish to log out of the application, you can do this on the Account page of any application. Note that this also logs you out of Console.

One-off setup

To set up the app and have it connect to your warehouse, we require a warehouse user/role for the app to run as; the steps below provide more information.

Once you have purchased the Snowplow Digital Analytics package, or another package containing Data Apps, Snowplow's Infrastructure Support team will contact you to review the setup details, and then deploy the data applications into your cloud environment.

Warehouse permissions

All data applications will require warehouse user credentials with read access to the tables powering the data applications. There are two options:

  1. Reuse the Data Modeling User - in this case we will use the existing Data Modeling user you created during your onboarding. Note that this user may need additional table access granted to it to use your own tables in any of the apps, e.g. the funnel builder.
  2. [Recommended] Create a new Data Applications user - you can also create a new warehouse user with more fine-grained permissions. To create this, you can run the following script and then pass us the details via secure messaging in Console.

While it is possible to reuse the same user for each data app, it may be beneficial to have a user per app for easier management of access and for logging purposes. In this case, please alter the scripts below to name each user/role as required.
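If you do opt for one user per app, a small helper like the following can derive the per-app names to substitute into the scripts. The naming convention shown is purely an example, not a Snowplow requirement:

```python
def app_credentials(app_name: str) -> dict:
    """Derive an example warehouse user/role name pair for a given
    data app. Adjust the convention to your own standards."""
    slug = app_name.lower().replace(" ", "_")
    return {
        "user": f"data_app_{slug}_user",
        "role": f"data_app_{slug}_role",
    }

# Hypothetical app names, used only to illustrate the output shape.
names = [app_credentials(a) for a in ["Funnel Builder", "Attribution"]]
```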

Create and share a GCP Service Account with us that has the following roles attached to it:

```
data_app_roles = [
  # (list of required roles)
]
```

Once created, share the following details back with us:

  1. Project ID
  2. BigQuery Dataset ID (the dataset the app should default to, configurable in most apps once launched)
  3. BigQuery Dataset Region (the region of that dataset, configurable in most apps once launched)
  4. BigQuery Service Account JSON (please send this in a secure format)

This grants read access to all tables within a schema for increased flexibility in the applications, but you can of course limit this to specific tables or views as required.

Once you have passed us the user credentials, there may be some additional setup steps or requirements for each data app.
