Receive events from Snowplow

Introduction

Snowplow is a popular choice for event instrumentation. If you use Snowplow, you can share your event data with Kubit in one of two ways:

  1. If you already store the Snowplow event tables in Snowflake, BigQuery, Redshift, or Databricks, the recommended approach is to share them with Kubit by following the corresponding Direct Connect guide (for Snowflake, see the sketch after this list).

  2. If you have not built your data warehouse yet, you can configure a Snowplow destination that targets the Kubit Snowflake account.
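For the Direct Connect case on Snowflake, sharing typically amounts to a Snowflake secure data share. The sketch below is illustrative only: the database, schema, user, and account names are placeholders, and the exact objects and the Kubit account identifier come from the Direct Connect guide.

```python
# Illustrative sketch of sharing Snowplow event tables with Kubit via a
# Snowflake secure data share. All names below (snowplow_db, KUBIT_ACCOUNT,
# connection details) are placeholders -- use the values from the
# Direct Connect guide.
import snowflake.connector

conn = snowflake.connector.connect(
    user="ADMIN_USER",       # placeholder: a user allowed to create shares
    password="***",
    account="your_account",  # placeholder: your Snowflake account identifier
)

statements = [
    "CREATE SHARE IF NOT EXISTS kubit_share",
    "GRANT USAGE ON DATABASE snowplow_db TO SHARE kubit_share",
    "GRANT USAGE ON SCHEMA snowplow_db.atomic TO SHARE kubit_share",
    # Snowplow loaders write enriched events to the atomic.events table.
    "GRANT SELECT ON TABLE snowplow_db.atomic.events TO SHARE kubit_share",
    # KUBIT_ACCOUNT stands in for the account Kubit asks you to add.
    "ALTER SHARE kubit_share ADD ACCOUNTS = KUBIT_ACCOUNT",
]

cur = conn.cursor()
try:
    for stmt in statements:
        cur.execute(stmt)
finally:
    cur.close()
    conn.close()
```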

Snowflake Destination Integration Steps

  1. The Kubit team will provide you with the Snowflake credentials for the destination.

  2. Follow the Snowflake destination guide for your Snowplow plan:

    1. Enterprise/Open Source: set up the Snowflake loader as described in the Snowplow documentation for your plan.

πŸ‘ Tip

We recommend picking the Spark transformer as it supports deduplication prior to loading the data into the data warehouse.
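To see why deduplication matters: Snowplow pipelines deliver events at least once, so without a dedup step the same event_id can be loaded more than once. The check below is a sketch, not part of Snowplow or Kubit tooling; the connection details and database name are placeholders, while atomic.events, event_id, and the count logic reflect Snowplow's standard schema.

```python
# Sketch of a duplicate check on a Snowplow events table. Connection details
# and the database name are placeholders; atomic.events and event_id are
# Snowplow's standard schema.
import snowflake.connector

conn = snowflake.connector.connect(
    user="USER", password="***", account="your_account",  # placeholders
)
cur = conn.cursor()
cur.execute("""
    SELECT event_id, COUNT(*) AS copies
    FROM snowplow_db.atomic.events
    GROUP BY event_id
    HAVING COUNT(*) > 1
    LIMIT 10
""")
for event_id, copies in cur.fetchall():
    print(f"{event_id} appears {copies} times")
cur.close()
conn.close()
```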

Handling Historical Data

Snowplow itself does not store your event data, so there is no built-in way to replay historical events: a destination starts receiving events only from the moment it is configured. If you need some historical tail and are already storing the event stream in a data warehouse, we suggest sharing your data through Direct Connect instead of using a destination.
