
Unified Digital Quickstart

Requirements

In addition to dbt being installed:

To model web events:

To model mobile events:
  • mobile events dataset being available in your database
  • Snowplow Android or iOS mobile tracker (version 1.1.0 or later) or the React Native tracker implemented
  • Mobile session context enabled (iOS or Android)
  • Screen view events enabled (iOS or Android)

Installation

Make sure to create a new dbt project and import this package via packages.yml as recommended by dbt, or add it to an existing top-level project. Do not fork the packages themselves.

Check dbt Hub for the latest installation instructions, or read the dbt docs for more information on installing packages. If you are using multiple packages you may need to up/downgrade a specific package to ensure compatibility.

packages.yml

packages:
  - package: snowplow/snowplow_unified
    version: 0.4.0

Note

Make sure to run the dbt deps command after updating your packages.yml to ensure you have the specified version of each package installed in your project.
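
For example, a typical install-and-verify sequence looks like this (the dbt ls check is optional and assumes your profile and target are already configured):

```shell
# Install the package versions pinned in packages.yml.
dbt deps

# Optional sanity check: list the models the package now provides.
dbt ls --select package:snowplow_unified
```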

Setup

1. Override the dispatch order in your project

To take advantage of the optimized upsert that the Snowplow packages offer, you need to ensure that certain macros are dispatched to snowplow_utils before dbt-core. This can be achieved by adding the following to the top level of your dbt_project.yml file:

dbt_project.yml

dispatch:
  - macro_namespace: dbt
    search_order: ['snowplow_utils', 'dbt']

If you do not do this, the package will still work, but the incremental upserts will become more costly over time.

2. Add the selectors.yml file

Within the packages we have provided a suite of suggested selectors to run and test the models within the package together with the Unified Digital Model. This leverages dbt's selector flag. You can find out more about each selector in the YAML Selectors section.

These are defined in the selectors.yml file (source) within the package; however, in order to use these selectors you will need to copy this file into your own dbt project directory. This is a top-level file and should therefore sit alongside your dbt_project.yml file. If you are using multiple packages in your project you will need to combine the contents of these into a single file.
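
For example, after running dbt deps the packaged file can be copied into your project root (the path below assumes dbt's default packages-install-path of dbt_packages; adjust it if yours differs):

```shell
# Copy the suggested selectors alongside your dbt_project.yml.
cp dbt_packages/snowplow_unified/selectors.yml .

# The selectors can then be used via dbt's --selector flag:
dbt run --selector snowplow_unified
```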

3. Check source data

This package will by default assume your Snowplow events data is contained in the atomic schema of your target.database. In order to change this, please add the following to your dbt_project.yml file:

dbt_project.yml

vars:
  snowplow_unified:
    snowplow__atomic_schema: schema_with_snowplow_events
    snowplow__database: database_with_snowplow_events

Databricks only

Please note that your target.database is NULL if you are using Databricks. Databricks uses the terms schema and database interchangeably, so dbt's Databricks implementation always uses the schema value; adjust your snowplow__atomic_schema value if you need to.

Next, Unified Digital assumes you are modeling both web and mobile events and expects certain fields to exist for each. If you are only tracking and modeling one of them, e.g. web data, you can disable the other as below:

dbt_project.yml

vars:
  snowplow_unified:
    snowplow__enable_mobile: false
    snowplow__enable_web: true

Note these are both true by default, so you only need to add the one you wish to disable.

4. Enable desired contexts

The Unified Digital Model has the option to join in data from the following Snowplow enrichments and out-of-the-box context entities:

  • IAB enrichment
  • UA Parser enrichment
  • YAUAA enrichment
  • Browser context
  • Mobile context
  • Geolocation context
  • App context
  • Screen context
  • Deep Link context
  • App Error context
  • Core Web Vitals
  • Consent (preferences & CMP visible)
  • Mobile screen summary (used for screen engagement calculation)

By default these are all disabled in the Unified Digital Model. Assuming you have the enrichments turned on in your Snowplow pipeline, to enable the contexts within the package please add the following to your dbt_project.yml file:

dbt_project.yml

vars:
  snowplow_unified:
    snowplow__enable_iab: true
    snowplow__enable_ua: true
    snowplow__enable_yauaa: true
    snowplow__enable_browser_context: true
    snowplow__enable_mobile_context: true
    snowplow__enable_geolocation_context: true
    snowplow__enable_application_context: true
    snowplow__enable_screen_context: true
    snowplow__enable_deep_link_context: true
    snowplow__enable_consent: true
    snowplow__enable_cwv: true
    snowplow__enable_app_errors: true
    snowplow__enable_screen_summary_context: true

5. Filter your dataset

You can specify both the start_date at which to start processing events and the app_ids to filter for. By default the start_date is set to 2020-01-01 and all app_ids are selected. To change this, add the following to your dbt_project.yml file:

dbt_project.yml

vars:
  snowplow_unified:
    snowplow__start_date: 'yyyy-mm-dd'
    snowplow__app_id: ['my_app_1','my_app_2']

6. Verify page ping variables

The Unified Digital Model processes page ping events to calculate web page engagement times. If your tracker configuration for min_visit_length (default 5) and heartbeat (default 10) differs from the defaults provided in this package, you can override them by adding the following to your dbt_project.yml:

dbt_project.yml

vars:
  snowplow_unified:
    snowplow__min_visit_length: 5 # Default value
    snowplow__heartbeat: 10 # Default value
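
To make the relationship concrete: with activity tracking enabled, the first page ping fires min_visit_length seconds after page load and subsequent pings fire every heartbeat seconds. A small illustrative sketch of that schedule (not the package's internal logic):

```shell
# With min_visit_length=5 and heartbeat=10, page pings fire at
# 5s, 15s, 25s, 35s, ... after page load.
min_visit_length=5
heartbeat=10
for i in 0 1 2 3; do
  echo "$(( min_visit_length + i * heartbeat ))s"
done
# prints: 5s 15s 25s 35s
```

If your tracker uses different values, mirror them in these variables so the engagement-time calculation stays accurate.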

7. Additional vendor-specific configuration

BigQuery only

Verify which column your events table is partitioned on. It will likely be partitioned on collector_tstamp or derived_tstamp. If it is partitioned on collector_tstamp you should set snowplow__derived_tstamp_partitioned to false. This will ensure only the collector_tstamp column is used for partition pruning when querying the events table:

dbt_project.yml

vars:
  snowplow_unified:
    snowplow__derived_tstamp_partitioned: false

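One way to check the partition column, assuming the bq CLI is installed and your events table lives at my_project:my_dataset.events (a placeholder — substitute your own project and dataset):

```shell
# Inspect the partitioning configuration of the events table;
# the partition column appears as "field" under "timePartitioning".
bq show --format=prettyjson my_project:my_dataset.events | grep -A 3 '"timePartitioning"'
```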
Databricks only - setting the databricks_catalog

Add the following variable to your dbt project's dbt_project.yml file:

dbt_project.yml

vars:
  snowplow_unified:
    snowplow__databricks_catalog: 'hive_metastore'

Depending on the use case, this should be either the catalog (for Unity Catalog users on the Databricks connector 1.1.1 onwards; it defaults to 'hive_metastore') or the same value as your snowplow__atomic_schema (unless changed, it should be 'atomic'). This is needed to handle the database property within models/base/src_base.yml.

A more detailed explanation for how to set up your Databricks configuration properly can be found in Unity Catalog support.

8. Run your model

You can run your models for the first time by running the below commands (see the operation page for more information on the operation of the package). As this package contains some seed files, you will need to seed these first:

dbt seed --select snowplow_unified --full-refresh
dbt run --selector snowplow_unified

9. Enable extras

The package comes with additional modules and functionality that you can enable, for more information see the consent tracking, conversions, and core web vitals documentation.
