[Bucket](https://bucket.co/?utm_source=segmentio&utm_medium=docs&utm_campaign=partners){:target="blank"} is feature-focused analytics. Bucket empowers software teams with a repeatable approach to shipping features that customers crave.
src/connections/reverse-etl/setup.md
30 additions & 40 deletions
@@ -14,7 +14,7 @@ Follow these 4 steps to set up Reverse ETL:
4. [Create mappings](#step-4-create-mappings)

## Step 1: Add a source
- A source is where your data originates from. Traditionally in Segment, a [source](/docs/connections/sources/#what-is-a-source) is a website, server library, mobile SDK, or cloud application which can send data into Segment. In Reverse ETL, your data warehouse is the source.
+ In Reverse ETL, a source is your data warehouse.

> warning ""
> You need to be a user that has both read and write access to the warehouse.
@@ -24,52 +24,38 @@ To add your warehouse as a source:
1. Navigate to **Connections > Sources** and select the **Reverse ETL** tab in the Segment app.
2. Click **+ Add Reverse ETL source**.
3. Select the source you want to add.
- 4. Follow the corresponding setup guide for your Reverse ETL source:
+ 4. Follow the corresponding guide to set up the required permissions for your Reverse ETL source:
After you add your data warehouse as a source, you can [add a model](#step-2-add-a-model) to your source.
## Step 2: Add a model
- Models are SQL queries that define sets of data you want to synchronize to your Reverse ETL destinations. After you add your source, you can add a model.
-
- > info "Use Segment's dbt extension to centralize model management and versioning"
- > Users who set up a BigQuery, Databricks, Postgres, Redshift, or Snowflake source can use Segment's [dbt extension](/docs/segment-app/extensions/dbt/) to centralize model management and versioning, reduce redundancies, and run CI checks to prevent breaking changes.
- >
- > Extensions is currently in public beta and is governed by Segment's [First Access and Beta Preview Terms](https://www.twilio.com/en-us/legal/tos){:target="_blank"}. During Public Beta, Extensions is available for Team and Developer plans only. [Reach out to Segment](mailto:[email protected]) if you're on a Business Tier plan and would like to participate in the Public Beta.
+ Models define sets of data you want to sync to your Reverse ETL destinations. A source can have multiple models. Segment supports [SQL models](/docs/connections/reverse-etl/setup/#step-4-create-mappings) and [dbt models](/docs/segment-app/extensions/dbt/).

- To add your first model:
+ ### SQL editor
1. Navigate to **Connections > Sources** and select the **Reverse ETL** tab. Select your source and click **Add Model**.
2. Click **SQL Editor** as your modeling method. (Segment will add more modeling methods in the future.)
- 3. Enter the SQL query that’ll define your model. Your model is used to map data to your Reverse ETL destinations.
+ 3. Enter the SQL query that’ll define your model (a sample query is shown after these steps). Your model is used to map data to your Reverse ETL destination(s).
4. Choose a column to use as the unique identifier for each record in the **Unique Identifier column** field.
- * The Unique Identifier should be a column with unique values per record to ensure checkpointing works as expected. It can potentially be a primary key. This column is used to detect new, updated, and deleted records.
- 5. Click **Preview** to see a preview of the results of your SQL query. The data from the preview is extracted from the first 10 records of your warehouse.
+ * The Unique Identifier should be a column with unique values per record to ensure checkpointing works as expected, like a primary key. This column is used to detect new, updated, and deleted records.
+ 5. Click **Preview** to see a preview of the first 10 records returned by your SQL query.
* Segment caches preview queries and result sets in the UI, and stores the preview cache at the source level. If you make two queries for the same source, Segment returns identical preview results. However, during the next synchronization, the latest data will be sent to the connected destinations.
6. Click **Next**.
7. Enter your **Model Name**.
8. Click **Create Model**.

- To add multiple models to your source, repeat steps 1-8 above.
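For illustration, here is a minimal sketch of the kind of query you might enter in step 3. The table and column names (`analytics.users`, `user_id`, and so on) are placeholders, not anything Segment requires — swap in whatever exists in your warehouse:

```sql
-- Hypothetical model query: each row it returns is a record Reverse ETL can sync.
SELECT
    user_id,       -- unique per record, so it works well as the Unique Identifier column
    email,
    plan_tier,
    last_seen_at
FROM analytics.users
WHERE email IS NOT NULL
```

Because Segment uses the Unique Identifier to detect new, updated, and deleted records between syncs, the query should return exactly one row per identifier value.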
- ### Edit your model
-
- To edit your model:
- 1. Navigate to **Connections > Destinations** and select the **Reverse ETL** tab.
- 2. Select the source and the model you want to edit.
- 3. On the overview tab, click **Edit** to edit your query.
- 4. Click the **Settings** tab to edit the model name or change the schedule settings.
+ ### dbt model
+ Use Segment's [dbt extension](/docs/segment-app/extensions/dbt/) to centralize model management and versioning. Users who set up a BigQuery, Databricks, Postgres, Redshift, or Snowflake source can use the extension to reduce redundancies and run CI checks to prevent breaking changes.
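For comparison, a dbt model is typically just a SQL `SELECT` statement stored as a file in your dbt project. A minimal, hypothetical sketch follows — the file path, `ref('stg_users')`, and column names are placeholders:

```sql
-- models/reverse_etl/active_users.sql (hypothetical dbt model)
-- dbt resolves ref() to the fully qualified table name in your warehouse.
SELECT
    user_id,    -- unique identifier used for checkpointing
    email,
    plan_tier
FROM {{ ref('stg_users') }}
WHERE is_active
```

Managing the query as a dbt model lets you version it alongside the rest of your dbt project instead of editing it in the SQL editor.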

## Step 3: Add a destination
- Once you’ve added a model, you need to add a destination. In Reverse ETL, destinations are the business tools or apps you use that Segment syncs the data from your warehouse to.
+ In Reverse ETL, destinations are the business tools or apps you use that Segment syncs the data from your warehouse to. A model can have multiple destinations.

- Reverse ETL supports 30+ destinations: see all destinations listed in the [Reverse ETL catalog](/docs/connections/reverse-etl/reverse-etl-catalog/). If the destination you want to send data to is not listed in the Reverse ETL catalog, use the [Segment Connections Destination](/docs/connections/reverse-etl/reverse-etl-catalog/#segment-connections-destination) to send data from your Reverse ETL warehouse to your destination.
-
- Engage users can use the [Segment Profiles Destination](/docs/connections/destinations/catalog/actions-segment-profiles/) to create and update [Profiles](/docs/unify/) that can then be accessed through [Profile API](/docs/unify/profile-api/) and activated within [Twilio Engage](/docs/engage).
+ Refer to the [Reverse ETL catalog](/docs/connections/reverse-etl/reverse-etl-catalog/) to view the supported actions destinations. Reverse ETL supports two unique destinations:
+ - **[Segment Connections Destination](/docs/connections/reverse-etl/reverse-etl-catalog/#segment-connections-destination)**: Send warehouse data back into Segment to leverage your existing mappings or access non-actions destinations in the Connections catalog.
+ - **[Segment Profiles Destination](/docs/connections/destinations/catalog/actions-segment-profiles/)**: Engage Premier Subscriptions users can use Reverse ETL to sync subscription data from their warehouses to destinations.

> info "Separate endpoints and credentials required to set up third party destinations"
> Before you begin setting up your destinations, note that each destination has different authentication requirements. See the documentation for your intended destination for more details.
@@ -84,7 +70,7 @@ To add your first destination:
7. Navigate to the destination settings tab and enable the destination. If the destination is disabled, then Segment won't be able to start a sync.

## Step 4: Create mappings
- After you’ve added a destination, you can create mappings from your warehouse to the destination. Mappings enable you to map the data you extract from your warehouse to the fields in your destination.
+ Mappings enable you to map the data you extract from your warehouse to the fields in your destination. A destination can have multiple mappings.

To create a mapping:
1. Navigate to **Connections > Destinations** and select the **Reverse ETL** tab.
@@ -105,24 +91,28 @@ To create a mapping:
<!---* _(Optional)_ Use the [Suggested Mappings](#suggested-mappings) feature to identify and match near-matching field names to streamline the field mapping process. -->
8. In the **Send test record section**, select a test record to preview the fields that you mapped to your destination. When you've verified that the records appear as expected, click **Next**.
9. Enter a name for your mapping. The name initially defaults to the Action's name, for example, `Track Event`, but you can make changes to this default name.
- 9. Select the Schedule type for the times you want the model’s data to be extracted from your warehouse. You can choose from:
- * **Interval**: Extractions perform based on a selected time cycle.
+ 10. Select how often you want Segment to sync your data under **Schedule configuration**.
+ * **Interval**: Extractions perform based on a selected time cycle. Select one of the following options: 15 minutes, 30 minutes, 1 hour, 2 hours, 4 hours, 6 hours, 8 hours, 12 hours, 1 day.
* **Day and time**: Extractions perform at specific times on selected days of the week.
- 10. Select how often you want the schedule to sync in **Schedule configuration**.
- * For an **Interval** schedule type, you can choose from: 15 minutes, 30 minutes, 1 hour, 2 hours, 4 hours, 6 hours, 8 hours, 12 hours, 1 day.
- * 15 minutes is considered real-time for warehouse syncs
- * For a **Day and time** schedule type, you can choose the day(s) you’d like the schedule to sync as well as the time.
- * You can only choose to start the extraction at the top of the hour.
- * Scheduling multiple extractions to start at the same time inside the same data warehouse causes extraction errors.
+ 11. Select the destination you’d like to enable on the **My Destinations** page under **Reverse ETL > Destinations**.
+ 12. Turn the toggle on for the **Mapping Status**. Events that match the trigger condition in the mapping will be sent to the destination.
+ * If you disable the mapping, events that match the trigger condition in the mapping won’t be sent to the destination.
+
+ ## Initial sync for a given mapping
+ After you've set up your source, model, destination, and mappings for Reverse ETL, your data will extract and sync to your destination(s) right away if you chose an interval schedule. If you set your data to extract at a specific day and time, the extraction will take place then.

- To add multiple mappings from your warehouse to your destination, repeat steps 1-10 above.
+ ## Edit Reverse ETL syncs
+ ### Edit your model
+
+ To edit your model:
+ 1. Navigate to **Connections > Destinations** and select the **Reverse ETL** tab.
+ 2. Select the source and the model you want to edit.
+ 3. On the overview tab, click **Edit** to edit your query.
+ 4. Click the **Settings** tab to edit the model name or change the schedule settings.

### Edit your mapping

To edit your mapping:
1. Navigate to **Connections > Destinations** and select the **Reverse ETL** tab.
2. Select the destination and the mapping you want to edit.
3. Select the **...** three dots and click **Edit mapping**. If you want to delete your mapping, select **Delete**.
- ## Using Reverse ETL
- After you've set up your source, model, destination, and mappings for Reverse ETL, your data will extract and sync to your destination(s) right away if you chose an interval schedule. If you set your data to extract at a specific day and time, the extraction will take place then.
src/connections/sources/catalog/libraries/website/javascript/troubleshooting.md
12 additions & 0 deletions
@@ -30,6 +30,18 @@ var writeKey;
ENV === 'production' ? writeKey = 'A' : writeKey = 'B';
```

+ ## How do I resolve the 'Failed to Load Analytics.js ChunkLoadError'?
+
+ The error can occur for different reasons:
+
+ * Snippet syntax: Ensure you correctly added the Segment snippet to the page. Check for any missing or extra characters. Follow [this guide](/docs/connections/sources/catalog/libraries/website/javascript/quickstart/#step-2-install-segment-to-your-site).
+ * NPM package: If you're using Segment through NPM, refer to [this guide](/docs/connections/sources/catalog/libraries/website/javascript/quickstart/#step-2b-install-segment-as-a-npm-package).
+ * Browser cache: Clear the browser cache, as this is a common cause for `ChunkLoadError`.
+ * Cloudflare caching: If you use Cloudflare to proxy Segment, disable caching for the Segment JS file.

## Do you see events appear in your debugger?

When you reload the page, does your debugger show a new [`page`](/docs/connections/spec/page)? You can also check the JavaScript console in the browser and manually fire an event, like an Identify call, which would show up in the debugger.