Commit e32c3fb

Fix placeholders in Vector, APL samples in dashboard elements (#133)
1 parent 20cfe58 commit e32c3fb

8 files changed: +87 -43

dashboard-elements/heatmap.mdx (+5)

@@ -23,6 +23,11 @@ Heatmaps represent the distribution of numerical data by grouping values into ra
 
 ## Example with Advanced Query Language
 
+```kusto
+['http-logs']
+| summarize histogram(req_duration_ms, 15) by bin_auto(_time)
+```
+
 <Frame>
 <img src="/doc-assets/shots/heatmap-apl.png" alt="Heatmap example with Advanced Query Language" />
 </Frame>

dashboard-elements/log-stream.mdx (+5)

@@ -16,6 +16,11 @@ The log stream dashboard element displays your logs as they come in real-time. E
 
 ## Example with Advanced Query Language
 
+```kusto
+['sample-http-logs']
+| project method, status, content_type
+```
+
 <Frame>
 <img src="/doc-assets/shots/log-stream-chart-apl.png" />
 </Frame>

dashboard-elements/pie-chart.mdx (+5)

@@ -23,6 +23,11 @@ Pie charts can illustrate the distribution of different types of event data. Eac
 
 ## Example with Advanced Query Language
 
+```kusto
+['http-logs']
+| summarize count() by status
+```
+
 <Frame>
 <img src="/doc-assets/shots/pie-chart-apl.png" alt="Pie chart example with Advanced Query Language" />
 </Frame>

dashboard-elements/scatter-plot.mdx (+5)

@@ -18,6 +18,11 @@ For example, plot response size against response time for an API to see if large
 
 ## Example with Advanced Query Language
 
+```kusto
+['sample-http-logs']
+| summarize avg(req_duration_ms), avg(resp_header_size_bytes) by resp_body_size_bytes
+```
+
 <Frame>
 <img src="/doc-assets/shots/scatter-chart-apl-2.png" />
 </Frame>

dashboard-elements/statistic.mdx (+5)

@@ -15,6 +15,11 @@ Statistics dashboard elements display a summary of the selected metrics over a g
 
 ## Example with Advanced Query Language
 
+```kusto
+['sample-http-logs']
+| summarize avg(resp_body_size_bytes)
+```
+
 <Frame>
 <img src="/doc-assets/shots/apl-chart-statistic-2.png" />
 </Frame>

dashboard-elements/table.mdx (+5)

@@ -17,6 +17,11 @@ The table dashboard element displays a summary of any attributes from your metri
 
 With this option, the table chart type has the capability to display a non-aggregated view of events.
 
+```kusto
+['sample-http-logs']
+| summarize avg(resp_body_size_bytes) by bin_auto(_time)
+```
+
 <Frame>
 <img src="/doc-assets/shots/table-chart-apl.png" />
 </Frame>

dashboard-elements/time-series.mdx (+5)

@@ -16,6 +16,11 @@ Time series charts show the change in your data over time which can help identif
 
 ## Example with Advanced Query Language
 
+```kusto
+['sample-http-logs']
+| summarize count() by bin_auto(_time)
+```
+
 <Frame>
 <img src="/doc-assets/shots/timeseries-chart-apl.png" />
 </Frame>

send-data/vector.mdx (+52 -43)

@@ -10,18 +10,23 @@ isPopular: true
 popularityOrder: 3
 ---
 
+import Prerequisites from "/snippets/standard-prerequisites.mdx"
+import Replacement from "/snippets/standard-replacement.mdx"
+
 <Frame caption="Vector">
 <img src="/doc-assets/shots/vector-axiom.png" alt="Vector" />
 </Frame>
 
 Vector is a lightweight and ultra-fast tool for building observability pipelines. It has a built-in support for shipping logs to Axiom through the [`axiom` sink](https://vector.dev/docs/reference/configuration/sinks/axiom/).
 
+<Prerequisites />
+
 ## Installation
 
 Follow the [quickstart guide in the Vector documentation](https://vector.dev/docs/setup/quickstart/) to install Vector, and to configure sources and sinks.
 
 <Warning>
-If you use Vector version v0.41.1 or earlier, use the `@timestamp` field instead of `_time` to specify the timestamp of the events. For more information, see [Timestamp in legacy Vector versions](#timestamp-in-legacy-vector-versions).
+If you use Vector version v0.41.1 (released on September 11, 2024) or earlier, use the `@timestamp` field instead of `_time` to specify the timestamp of the events. For more information, see [Timestamp in legacy Vector versions](#timestamp-in-legacy-vector-versions).
 
 If you upgrade from Vector version v0.41.1 or earlier to a newer version, update your configuration. For more information, see [Upgrade from legacy Vector version](#upgrade-from-legacy-vector-version).
 </Warning>
@@ -32,9 +37,7 @@ Send data to Axiom with Vector using the [`file` method](https://vector.dev/docs
 
 The example below configures Vector to read and collect logs from files and send them to Axiom:
 
-1. [Create a dataset in Axiom](/reference/settings#data).
-2. [Generate an Axiom API token](/reference/settings#access-overview).
-3. Create a vector configuration file `vector.toml` with the following content:
+1. Create a vector configuration file `vector.toml` with the following content:
 
 ```toml
 [sources.VECTOR_SOURCE_ID]
@@ -44,18 +47,18 @@ The example below configures Vector to read and collect logs from files and send
 [sinks.SINK_ID]
 type = "axiom"
 inputs = ["VECTOR_SOURCE_ID"]
-token = "AXIOM_API_TOKEN"
-dataset = "AXIOM_DATASET"
+token = "API_TOKEN"
+dataset = "DATASET_NAME"
 ```
 
-4. In the code above, replace the following:
+1. In the code above, replace the following:
 - Replace `VECTOR_SOURCE_ID` with the Vector source ID.
 - Replace `PATH_TO_LOGS` with the path to the log files. For example, `/var/log/**/*.log`.
 - Replace `SINK_ID` with the sink ID.
-- Replace `AXIOM_API_TOKEN` with the Axiom API token you have generated.
-- Replace `AXIOM_DATASET` with the name of the Axiom dataset where you want to send data.
+{/* list separator */}
+<Replacement />
 
-5. Run Vector to send logs to Axiom.
+1. Run Vector to send logs to Axiom.
 
 ### Example with data transformation
 
@@ -76,11 +79,13 @@ source = '''
 [sinks.SINK_ID]
 type = "axiom"
 inputs = ["filter_json_fields"]
-token = "AXIOM_API_TOKEN"
-dataset = "AXIOM_DATASET"
+token = "API_TOKEN"
+dataset = "DATASET_NAME"
 ```
 
-Replace `FIELD_TO_REMOVE` with the field you want to remove.
+- Replace `FIELD_TO_REMOVE` with the field you want to remove.
+{/* list separator */}
+<Replacement />
 
 <Note>
 Any changes to Vector’s `file` method can make the code example above outdated. If this happens, please refer to the [official Vector documentation on the `file` method](https://vector.dev/docs/reference/configuration/sources/file/), and we kindly ask you to inform us of the issue using the feedback tool at the bottom of this page.
@@ -112,17 +117,11 @@ timezone = "local"
 [sinks.axiom]
 type = "axiom"
 inputs = ["my_source_id"]
-token = "xaat-1234"
-dataset = "vector-dev"
+token = "API_TOKEN"
+dataset = "DATASET_NAME"
 ```
 
-DATASET is the name of your dataset. When logs are sent from your vector, it’s stored in a dataset in Axiom.
-
-[See creating a dataset for more](/reference/datasets)
-
-TOKEN is used to ingest or query data to your dataset. API token can be generated from settings on Axiom dashboard.
-
-[See creating an API token for more](/reference/tokens)
+<Replacement />
 
 ## Send Docker logs to Axiom
 
@@ -138,13 +137,13 @@ docker_host = "unix:///var/run/docker.sock"
 [sinks.axiom]
 type = "axiom"
 inputs = ["docker_logs"]
-dataset = "your_dataset_name" # replace with the name of your Axiom dataset
-token = "your_api_token" # replace with your Axiom API token
+dataset = "DATASET_NAME"
+token = "API_TOKEN"
 ```
 
-Replace `your_dataset_name` with the name of the [dataset](/reference/datasets) you want to send logs to in Axiom, and `your_api_token` with your [Axiom API token](/reference/tokens).
+<Replacement />
 
-- Run Vector: Start Vector with the configuration file you just created:
+Run Vector: Start Vector with the configuration file you just created:
 
 ```bash
 vector --config /path/to/vector.toml
@@ -165,10 +164,12 @@ region = "us-west-2" # replace with the AWS region of your bucket
 [sinks.axiom]
 type = "axiom"
 inputs = ["my_s3_source"]
-dataset = "your_dataset_name" # replace with the name of your Axiom dataset
-token = "your_api_token" # replace with your Axiom API token
+dataset = "DATASET_NAME"
+token = "API_TOKEN"
 ```
 
+<Replacement />
+
 Finally, run Vector with the configuration file using `vector --config ./vector.toml`. This starts Vector and begins reading logs from the specified S3 bucket and sending them to the specified Axiom dataset.
 
 ## Send Kafka logs to Axiom
@@ -186,10 +187,12 @@ auto_offset_reset = "earliest" # start reading from the beginning
 [sinks.axiom]
 type = "axiom"
 inputs = ["my_kafka_source"] # connect the Axiom sink to your Kafka source
-dataset = "your_dataset_name" # replace with the name of your Axiom dataset
-token = "your_api_token" # replace with your Axiom API token
+dataset = "DATASET_NAME" # replace with the name of your Axiom dataset
+token = "API_TOKEN" # replace with your Axiom API token
 ```
 
+<Replacement />
+
 Finally, you can start Vector with your configuration file: `vector --config /path/to/your/vector.toml`
 
 ## Send NGINX metrics to Axiom
@@ -230,10 +233,12 @@ endpoints = ["http://localhost/metrics"] # the endpoint where NGINX metrics are
 [sinks.axiom]
 type = "axiom" # must be: axiom
 inputs = ["nginx_metrics"] # use the metrics from the NGINX source
-dataset = "your_dataset_name" # replace with the name of your Axiom dataset
-token = "your_api_token" # replace with your Axiom API token
+dataset = "DATASET_NAME" # replace with the name of your Axiom dataset
+token = "API_TOKEN" # replace with your Axiom API token
 ```
 
+<Replacement />
+
 Finally, you can start Vector with your configuration file: `vector --config /path/to/your/vector.toml`
 
 ## Send Syslog logs to Axiom
@@ -250,10 +255,12 @@ mode="tcp"
 [sinks.axiom]
 type="axiom"
 inputs = [ "my_source_id" ] # required
-dataset="your_dataset_name" # replace with the name of your Axiom dataset
-token="your_api_token" # replace with your Axiom API token
+dataset="DATASET_NAME" # replace with the name of your Axiom dataset
+token="API_TOKEN" # replace with your Axiom API token
 ```
 
+<Replacement />
+
 ## Send Prometheus metrics to Axiom
 
 To send Prometheus scrape metrics using the Axiom sink, you need to create a configuration file, for example, `vector.toml`, with the following code:
@@ -268,17 +275,19 @@ endpoints = ["http://localhost:9090/metrics"] # replace with your Prometheus en
 [sinks.axiom]
 type = "axiom" # Axiom type
 inputs = ["my_prometheus_source"] # connect the Axiom sink to your Prometheus source
-dataset = "your_prometheus_dataset" # replace with the name of your Axiom dataset
-token = "your_api_token" # replace with your Axiom API token
+dataset = "DATASET_NAME" # replace with the name of your Axiom dataset
+token = "API_TOKEN" # replace with your Axiom API token
 ```
 
+<Replacement />
+
 Check out the [advanced configuration on Batch, Buffer configuration, and Encoding on Vector Documentation](https://vector.dev/docs/reference/configuration/sinks/axiom/)
 
 ## Timestamp in legacy Vector versions
 
-If you use Vector version v0.41.1 or earlier, use the `@timestamp` field instead of `_time` to specify the timestamp in the event data you send to Axiom. For example: `{"@timestamp":"2022-04-14T21:30:30.658Z..."}`. For more information, see [Requirements of the timestamp field](/reference/field-restrictions#requirements-of-the-timestamp-field). In the case of Vector version v0.41.1 or earlier, the requirements explained on the page apply to the `@timestamp` field, not to `_time`.
+If you use Vector version v0.41.1 (released on September 11, 2024) or earlier, use the `@timestamp` field instead of `_time` to specify the timestamp in the event data you send to Axiom. For example: `{"@timestamp":"2022-04-14T21:30:30.658Z..."}`. For more information, see [Requirements of the timestamp field](/reference/field-restrictions#requirements-of-the-timestamp-field). In the case of Vector version v0.41.1 or earlier, the requirements explained on the page apply to the `@timestamp` field, not to `_time`.
 
-If you use Vector version v0.42.0 or newer, use the `_time` field as usual for other collectors.
+If you use Vector version v0.42.0 (released on October 21, 2024) or newer, use the `_time` field as usual for other collectors.
 
 ### Upgrade from legacy Vector version
 
@@ -307,8 +316,8 @@ file= 'example.vrl' # See above
 [sinks.debug]
 type = "axiom"
 inputs = [ "migrate" ]
-dataset = "my-axiom-dataset" # No change
-token = "your-token" # No change
+dataset = "DATASET_NAME" # No change
+token = "API_TOKEN" # No change
 
 [sinks.debug.encoding]
 codec = "json"
@@ -332,9 +341,9 @@ file= 'example.vrl' # See above
 type = "axiom"
 compression = "gzip" # Set the compression algorithm
 inputs = [ "migrate" ]
-dataset = "my-axiom-dataset" # No change
-token = "your-token" # No change
+dataset = "DATASET_NAME" # No change
+token = "API_TOKEN" # No change
 
 [sinks.debug.encoding]
 codec = "json"
-```
+```
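The pattern in this commit is to replace concrete sample credentials (like `xaat-1234` and `vector-dev`) with the uniform `API_TOKEN` and `DATASET_NAME` placeholders, which a shared `<Replacement />` snippet then explains once. As a minimal sketch of how a reader might fill those placeholders programmatically before running Vector (the helper function, template, and values below are illustrative, not part of the repository):

```python
# Sketch: substitute the standardized doc placeholders in a Vector config
# template. The placeholder names API_TOKEN and DATASET_NAME match the docs;
# the template text and sample values are hypothetical.

TEMPLATE = """\
[sinks.axiom]
type = "axiom"
inputs = ["my_source_id"]
token = "API_TOKEN"
dataset = "DATASET_NAME"
"""

def fill_placeholders(template: str, token: str, dataset: str) -> str:
    """Return the template with both placeholders replaced by real values."""
    return template.replace("API_TOKEN", token).replace("DATASET_NAME", dataset)

config = fill_placeholders(TEMPLATE, "xaat-example-token", "my-dataset")
print(config)
```

Writing the result to `vector.toml` and running `vector --config vector.toml` would then use the substituted values.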
