Data warehouses and lakes
To store your enriched Snowplow data, you will need to choose and set up a loader that delivers the enriched data to your data warehouse or lake.
Selecting a loader for your data warehouse
AWS

Destination | Loader | Status |
---|---|---|
Redshift | Snowplow RDB Loader | Production-ready |
Snowflake | Snowplow RDB Loader | Production-ready |
Databricks | Snowplow RDB Loader | Production-ready |
Postgres | Postgres Loader | Early release |

GCP

Destination | Loader | Status |
---|---|---|
Snowflake | Snowplow RDB Loader | Production-ready |
BigQuery | BigQuery Loader | Production-ready |
Postgres | Postgres Loader | Early release |
**Note:** Our warehouse loaders pick up the data from Enrich Kinesis on AWS and Enrich PubSub on GCP. Both the Stream Collector and Enrich allow you to use a streaming technology other than Kinesis and Pub/Sub, but if you go that route, you will need to define your own process to load your enriched data into the warehouse, as sketched below.
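As an illustration, here is a minimal sketch of such a custom loading process, assuming Kafka as the alternative streaming technology and Postgres as the destination. The topic name, table, and connection details are hypothetical, and a production process would also need batching, error handling, and schema management.

```python
# A minimal sketch of a custom loading process, assuming Kafka as the
# streaming technology and Postgres as the destination. The topic name,
# table, and connection details below are hypothetical.
from kafka import KafkaConsumer  # pip install kafka-python
import psycopg2                  # pip install psycopg2-binary

consumer = KafkaConsumer(
    "enriched-events",                   # hypothetical topic carrying enriched events
    bootstrap_servers="localhost:9092",
    value_deserializer=lambda v: v.decode("utf-8"),
)

conn = psycopg2.connect("dbname=snowplow user=loader")  # hypothetical DSN
conn.autocommit = True

with conn.cursor() as cur:
    for message in consumer:
        # Enriched events are tab-separated strings; app_id and platform
        # are the first two fields of the canonical enriched event format.
        fields = message.value.split("\t")
        cur.execute(
            "INSERT INTO enriched_events (app_id, platform, raw_event) "
            "VALUES (%s, %s, %s)",
            (fields[0], fields[1], message.value),
        )
```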
Selecting a loader for your data lake
Enriched data can also be loaded into cloud storage or a data lake, using the S3 Loader on AWS or the Google Cloud Storage Loader on GCP.
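To give a rough idea of what these loaders do, below is a minimal sketch that writes batches of enriched events to date-partitioned objects in S3. The bucket name and key layout are hypothetical; the real S3 Loader reads from the enriched stream and handles concerns such as buffering and compression for you.

```python
# Illustrative only: the S3 Loader implements this (and more) for you.
# The bucket name and key layout below are hypothetical.
import datetime
import boto3  # pip install boto3

s3 = boto3.client("s3")

def write_batch(events: list[str]) -> None:
    """Write one batch of enriched TSV events to a date-partitioned S3 key."""
    now = datetime.datetime.now(datetime.timezone.utc)
    key = f"enriched/{now:%Y-%m-%d}/{now:%H%M%S%f}.tsv"
    s3.put_object(
        Bucket="my-snowplow-lake",  # hypothetical bucket
        Key=key,
        Body="\n".join(events).encode("utf-8"),
    )
```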