
Add page for Snowplow Local #1214

Open · wants to merge 1 commit into `main`
3 changes: 2 additions & 1 deletion docs/get-started/index.md
@@ -5,12 +5,13 @@ sidebar_label: "Get started"
description: "Details on where and how Snowplow is deployed"
---

You can choose between Snowplow BDP Enterprise (paid, hosted in your cloud), Snowplow BDP Cloud (paid, hosted by Snowplow) and Snowplow Community Edition (free, hosted in your cloud). See the [feature comparison page](/docs/get-started/feature-comparison/index.md) for more information.
You can choose between Snowplow BDP Enterprise (paid, hosted in your cloud), Snowplow BDP Cloud (paid, hosted by Snowplow), Snowplow Community Edition (free, hosted in your cloud) and Snowplow Local (free, hosted on your local machine for development). See the [feature comparison page](/docs/get-started/feature-comparison/index.md) for more information.

Each offering has its own setup guide:
* [Snowplow BDP Enterprise](/docs/get-started/snowplow-bdp/index.md)
* [Snowplow BDP Cloud](/docs/get-started/snowplow-bdp/index.md)
* [Snowplow Community Edition](/docs/get-started/snowplow-community-edition/index.md)
* [Snowplow Local](/docs/get-started/snowplow-local/index.md)

## Snowplow BDP Enterprise

44 changes: 44 additions & 0 deletions docs/get-started/snowplow-local/index.md
@@ -0,0 +1,44 @@
---
title: "Setting up Snowplow Local"
date: "2025-04-14"
sidebar_position: 4
sidebar_label: "Snowplow Local"
---

> **Collaborator:** I think the page fits better in the API Reference section, below Micro and Mini. Please can you add a couple of details too:
> * a paragraph explaining why/when you'd use this pipeline over Micro/Mini
> * something about licensing and product support

## What is Snowplow Local?

Snowplow Local is a limited, local-first version of the Snowplow data pipeline, aimed at developers, that lets you run the core Snowplow components on your own machine.

This includes the Collector and Enrich as well as any of the streaming loaders for loading into BigQuery, Snowflake, Databricks or a data lake.
Snowplow Local uses Docker Compose to spin up and includes a basic UI to track events, so you can see them undergoing enrichment throughout your pipeline in real time.

Although Snowplow Local provides the core software for collecting, enriching, and storing events, it only contains a minimal user interface for the control plane. For a more comprehensive UI, consider [Snowplow BDP](https://docs.snowplow.io/docs/get-started/snowplow-bdp/) instead.

> **Collaborator:** needs screenshot of control plane, info about UI

## Who is Snowplow Local for?

Snowplow Local is designed for developers, data engineers and data scientists who want to develop, test and debug Snowplow pipelines locally without needing to deploy to a cloud environment. It's also useful for those who want to test or experiment with new features or changes to the pipeline without affecting a production environment.

Snowplow Local requires some familiarity with Docker, and as a result is best suited to more technical users who are comfortable with the basics of the command line and editing configuration files.

## What can you do with Snowplow Local?

* Develop and test new schemas and enrichments
* Test out new loaders (e.g., Snowflake, BigQuery, Lake Loader)
* View bad, incomplete and good events in an easy-to-use user interface
* Test changes to the pipeline configuration (Collector, Enrich, etc.)
* Stream data to your data warehouse or lake of choice
* Monitor pipeline performance and metrics using Grafana
* Test new or existing versions of the Snowplow pipeline
* Write enriched data to remote destinations (including S3, GCS, etc.)
* Test and validate Snowbridge configurations
* Send events remotely from another machine to your local pipeline (via `--profile tunnel`)
* Query data locally using DuckDB (when using the Lake Loader and Iceberg or Delta format)
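To exercise the "send events to your local pipeline" workflow above without a full tracker SDK, you can hand-build a request against the collector's GET pixel endpoint. This is a minimal sketch using page-view fields from the Snowplow tracker protocol; the collector host and port (`localhost:9090`) are assumptions — check your snowplow-local configuration for the actual collector address.

```python
import time
import uuid
from urllib.parse import urlencode

def page_view_payload(app_id: str, page_url: str) -> str:
    """Build a URL for the collector's GET pixel endpoint (/i).

    Field names follow the Snowplow tracker protocol (e=pv is a page
    view). The collector host/port below is an assumption for a local
    Docker Compose setup, not a documented snowplow-local default.
    """
    params = {
        "e": "pv",                            # event type: page view
        "p": "web",                           # platform
        "aid": app_id,                        # application id
        "url": page_url,                      # page URL being tracked
        "eid": str(uuid.uuid4()),             # unique event id
        "dtm": str(int(time.time() * 1000)),  # device timestamp (ms)
        "tv": "py-example-0.1",               # tracker version label
    }
    return "http://localhost:9090/i?" + urlencode(params)

print(page_view_payload("local-test", "https://example.com/home"))
```

Requesting the printed URL (for example with `curl`) from another machine, while the tunnel profile is active, is one way to confirm remote events reach your local pipeline.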

## What you will need

You will need Docker and Docker Compose installed, as well as access to the [GitHub repository](https://github.com/snowplow-incubator/snowplow-local).

You don't need access to any cloud environments or specific credentials.

Snowplow Local is available on GitHub [here](https://github.com/snowplow-incubator/snowplow-local). It's straightforward to spin up; for instructions, see the [README.md](https://github.com/snowplow-incubator/snowplow-local/blob/main/README.md).
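The spin-up steps can be sketched as below. The clone URL comes from this page; the `tunnel` profile name is taken from the feature list above, and everything else (directory layout, default services) is an assumption — the repository README is authoritative.

```shell
# Quickstart sketch: guarded so it only runs the pipeline commands when
# Docker and the Compose plugin are actually installed.
REPO=https://github.com/snowplow-incubator/snowplow-local.git

if command -v docker >/dev/null 2>&1 && docker compose version >/dev/null 2>&1; then
  git clone "$REPO"
  cd snowplow-local || exit 1
  docker compose up -d                      # start the core pipeline
  # docker compose --profile tunnel up -d   # also accept events from remote machines
else
  echo "Install Docker and Docker Compose first, then clone: $REPO"
fi
```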