Commit 599eb8a

Merge pull request #6593 from segmentio/batching-docs
Add context about batching to destination overview page [DOC-904]

2 parents: d82e6d2 + c96c547

File tree

1 file changed: +30 -3 lines changed

src/connections/destinations/index.md

Lines changed: 30 additions & 3 deletions
@@ -123,9 +123,15 @@ To add a Destination:

## Data deliverability

-Segment increases deliverability to destinations in two ways: [retries](#retries) and [replays](/docs/guides/what-is-replay/). Retries happen automatically for all customers, while replays are available on request for [Business](https://segment.com/pricing/) customers.
+Segment increases deliverability to destinations using [retries](#retries) and [replays](/docs/guides/what-is-replay/). Retries happen automatically for all customers, while replays are available on request for [Business Tier](https://segment.com/pricing/) customers.

-**Note:** Segment's data flow is primarily unidirectional, from Segment to integrated destinations. Segment does not inherently support a bidirectional flow where events, once delivered and processed by a destination, are sent back to Segment.
+> info ""
+> Segment's data flow is primarily unidirectional, from Segment to integrated destinations. Segment does not inherently support a bidirectional flow where events, once delivered and processed by a destination, are sent back to Segment.
+
+Segment also uses [batching](#batching) to increase deliverability to your destinations. Some destinations have batching enabled by default, and some, like Segment's [Webhook (Actions) Destination](/docs/connections/destinations/catalog/actions-webhook/), let you opt in to batching.
+
+> warning "Some cases of event batching might lead to observability loss"
+> While batching increases event deliverability, you might experience error amplification: if the entire batch fails, all events in it are marked with the same status. For example, if a batch fails due to one `429` (Rate Limit) error, the UI might show one `429` request failure for each item in the batch.
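The error amplification described in this warning can be illustrated with a short sketch. This is not Segment's implementation; the function and event names are hypothetical, chosen only to show how one failed request fans out into per-event failure marks:

```python
# Hypothetical sketch: when a batch is delivered as a single request,
# one HTTP error marks every event in the batch with the same status.

def mark_batch_result(events, http_status):
    """Assign one delivery status to every event in a failed batch."""
    return [{"event": e, "status": http_status} for e in events]

batch = ["Order Completed", "Page Viewed", "Signed Up"]

# One 429 (rate limit) response for the whole batch...
results = mark_batch_result(batch, 429)

# ...surfaces as one failure per event in delivery observability.
failures = [r for r in results if r["status"] == 429]
print(len(failures))  # 3 failures recorded for a single rate-limited request
```

In practice this means per-event error counts can overstate the number of failed requests while batching is enabled.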
### Retries

@@ -173,8 +179,29 @@ You can see the current destination endpoint API success rates and final deliver
[Replays](/docs/guides/what-is-replay/) allow customers to load historical data from Segment's S3 logs into downstream destinations which accept cloud-mode data. So, for example, if you wanted to try out a new email or analytics tool, Segment can replay your historical data into that tool. This gives you a great testing environment and prevents data lock-in when vendors try to hold data hostage.

> warning ""
-> If you submitted [`suppress_only` requests](https://segment.com/docs/privacy/user-deletion-and-suppression/#suppressed-users), Segment still retains historical events for those users, which can be replayed. If you do not want historical events replayed for suppressed users, submit `suppress_and_delete` requests instead.
+> If you submitted [`suppress_only` requests](/docs/privacy/user-deletion-and-suppression/#suppressed-users), Segment still retains historical events for those users, which can be replayed. If you do not want historical events replayed for suppressed users, submit `suppress_and_delete` requests instead.
+
+### Batching
+
+Segment uses [stream batching](#stream-batching) for all destinations that require near-realtime data and [bulk batching](#bulk-batching) for some data flows in its pipeline.
+
+#### Stream batching
+For all destinations except non-realtime Engage syncs and Reverse ETL syncs, Segment processes events from your source as they arrive, then flows the data downstream to your destinations in small batches, in a process called **stream batching**. These batches might contain different events between retry attempts, as events in previous batches may have succeeded, failed with a permanent error, or expired. This variability reduces the workload the system processes during partial successes, allows for better per-event handling, and reduces the chance of load-related failures by using variable batch formations.
+
+#### Bulk batching
+Some data flows can use a process called **bulk batching**, which supports batching for destinations that produce between several thousand and a million events at a time. Real-time workloads or the use of a Destination Insert Function may prevent bulk batches from being formed. Bulk batches contain the same events between retries.
+
+The following destinations support bulk batching:
+- [DV360](/docs/connections/destinations/catalog/actions-display-video-360/)
+- [Google Adwords Remarketing Lists](/docs/connections/destinations/catalog/adwords-remarketing-lists/)
+- [Klaviyo (Actions)](/docs/connections/destinations/catalog/actions-klaviyo/)
+- [Pinterest Audiences](/docs/connections/destinations/catalog/pinterest-audiences/)
+- [Snapchat Audiences](/docs/connections/destinations/catalog/snapchat-audiences/)
+- [LiveRamp](/docs/connections/destinations/catalog/actions-liveramp-audiences/)
+- [The Trade Desk CRM](/docs/connections/destinations/catalog/actions-the-trade-desk-crm/)

+> info "You must manually configure bulk batches for Actions destinations"
+> To support bulk batching for the Actions Webhook destination, you must set `enable-batching: true` and a `batch_size` of 1000 or greater.
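The stream-batching retry behavior described in this hunk, where batches are re-formed between attempts from only the events still pending, can be sketched as follows. This is a hypothetical illustration, not Segment's implementation; the outcome labels and the `attempt` callback are stand-ins:

```python
# Hypothetical sketch of stream batching: each retry re-forms batches from
# only the still-pending events, so batch contents can differ between
# attempts (unlike bulk batching, where retried batches are identical).

SUCCESS, RETRYABLE, PERMANENT = "success", "retryable", "permanent"

def stream_batch_deliver(events, attempt, batch_size=2, max_attempts=5):
    """Deliver events in small batches, re-forming batches between retries.

    `attempt` stands in for a real downstream delivery call and returns one
    of the outcome constants for a single event.
    """
    pending = list(events)
    for _ in range(max_attempts):
        if not pending:
            break
        still_pending = []
        # Batches are formed from whatever is still pending on this attempt.
        for i in range(0, len(pending), batch_size):
            for event in pending[i:i + batch_size]:
                outcome = attempt(event)
                if outcome == RETRYABLE:
                    still_pending.append(event)
                # SUCCESS and PERMANENT outcomes both leave the retry pool.
        pending = still_pending
    return pending  # events that never succeeded within max_attempts

# Illustrative outcomes: e2 is rate-limited once, e4 fails permanently.
outcomes = {"e2": [RETRYABLE, SUCCESS], "e4": [PERMANENT]}

def attempt(event):
    queue = outcomes.get(event)
    return queue.pop(0) if queue else SUCCESS

undelivered = stream_batch_deliver(["e1", "e2", "e3", "e4"], attempt)
print(undelivered)  # prints [] (e2 succeeded on retry; e4 left after a permanent error)
```

The second attempt here contains only `e2`, which shows why retried stream batches can be smaller and differently composed than the original batch.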
### IP Allowlist
