Vector is a lightweight and ultra-fast tool for building observability pipelines. It has built-in support for shipping logs to Axiom through the [`axiom` sink](https://vector.dev/docs/reference/configuration/sinks/axiom/).
<Prerequisites />
## Installation
Follow the [quickstart guide in the Vector documentation](https://vector.dev/docs/setup/quickstart/) to install Vector and to configure sources and sinks.
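For example, a minimal configuration like the following (a sketch using Vector's built-in `demo_logs` source and `console` sink; the component names are placeholders) prints generated sample events to stdout and lets you check that Vector runs before you connect it to Axiom:

```toml
[sources.test_logs]
type = "demo_logs"   # built-in source that generates sample events
format = "syslog"    # generate syslog-style lines
interval = 1.0       # one event per second

[sinks.print]
type = "console"     # write events to stdout
inputs = ["test_logs"]
encoding.codec = "json"
```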
<Warning>
If you use Vector version v0.41.1 (released on September 11, 2024) or earlier, use the `@timestamp` field instead of `_time` to specify the timestamp of the events. For more information, see [Timestamp in legacy Vector versions](#timestamp-in-legacy-vector-versions).
If you upgrade from Vector version v0.41.1 or earlier to a newer version, update your configuration. For more information, see [Upgrade from legacy Vector version](#upgrade-from-legacy-vector-version).
</Warning>
Send data to Axiom with Vector using the [`file` method](https://vector.dev/docs/reference/configuration/sources/file/).
The example below configures Vector to read and collect logs from files and send them to Axiom:
1. Create a Vector configuration file `vector.toml` with the following content:
```toml
[sources.VECTOR_SOURCE_ID]
type = "file"
include = ["PATH_TO_LOGS"]

[sinks.SINK_ID]
type = "axiom"
inputs = ["VECTOR_SOURCE_ID"]
token = "API_TOKEN"
dataset = "DATASET_NAME"
```
1. In the code above, replace the following:
- Replace `VECTOR_SOURCE_ID` with the Vector source ID.
- Replace `PATH_TO_LOGS` with the path to the log files. For example, `/var/log/**/*.log`.
- Replace `SINK_ID` with the sink ID.
{/* list separator */}
<Replacement />
1. Run Vector to send logs to Axiom.
### Example with data transformation
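As a minimal sketch (assuming a `file` source and a VRL program that only calls `del()`; the placeholders follow the example above), a `remap` transform named `filter_json_fields` drops a field from each event before it reaches the sink:

```toml
[sources.VECTOR_SOURCE_ID]
type = "file"
include = ["PATH_TO_LOGS"]

[transforms.filter_json_fields]
type = "remap"
inputs = ["VECTOR_SOURCE_ID"]
source = '''
del(.FIELD_TO_REMOVE)
'''
```

The Axiom sink then takes its input from the transform instead of directly from the source: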
```toml
[sinks.SINK_ID]
type = "axiom"
inputs = ["filter_json_fields"]
token = "API_TOKEN"
dataset = "DATASET_NAME"
```
- Replace `FIELD_TO_REMOVE` with the field you want to remove.
{/* list separator */}
<Replacement />
<Note>
Any changes to Vector’s `file` method can make the code example above outdated. If this happens, refer to the [official Vector documentation on the `file` method](https://vector.dev/docs/reference/configuration/sources/file/), and please inform us of the issue using the feedback tool at the bottom of this page.
</Note>

```toml
[sinks.axiom]
type = "axiom"
inputs = ["my_source_id"]
token = "API_TOKEN"
dataset = "DATASET_NAME"
```
```toml
dataset = "DATASET_NAME"
token = "API_TOKEN"
```
<Replacement />
Run Vector: Start Vector with the configuration file you just created:
```bash
vector --config /path/to/vector.toml
```
## Send logs from S3 to Axiom
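The sink below reads from an `aws_s3` source. As a rough sketch (the region and queue URL are placeholders, and the source assumes S3 event notifications are delivered through an SQS queue), such a source can look like this:

```toml
[sources.my_s3_source]
type = "aws_s3"
region = "us-west-2" # replace with the AWS region of your bucket
sqs.queue_url = "https://sqs.us-west-2.amazonaws.com/111111111111/my-queue" # replace with the SQS queue that receives S3 event notifications
```

The Axiom sink then forwards the log events it reads to your dataset: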
```toml
[sinks.axiom]
type = "axiom"
inputs = ["my_s3_source"]
dataset = "DATASET_NAME"
token = "API_TOKEN"
```
<Replacement />
Finally, run Vector with the configuration file using `vector --config ./vector.toml`. This starts Vector and begins reading logs from the specified S3 bucket and sending them to the specified Axiom dataset.
## Send Kafka logs to Axiom
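The sink below consumes from a `kafka` source. As a minimal sketch (the broker address, topic, and consumer group are placeholders), such a source can look like this:

```toml
[sources.my_kafka_source]
type = "kafka"
bootstrap_servers = "localhost:9092" # replace with your Kafka broker address
group_id = "vector-axiom"            # replace with your consumer group ID
topics = ["my_topic"]                # replace with the topics to read from
auto_offset_reset = "earliest"       # start reading from the beginning
```

The Axiom sink then ships the consumed messages to your dataset: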
```toml
[sinks.axiom]
type = "axiom"
inputs = ["my_kafka_source"] # connect the Axiom sink to your Kafka source
dataset = "DATASET_NAME" # replace with the name of your Axiom dataset
token = "API_TOKEN" # replace with your Axiom API token
```
<Replacement />
Finally, you can start Vector with your configuration file: `vector --config /path/to/your/vector.toml`
## Send NGINX metrics to Axiom
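The sink below reads from an `nginx_metrics` source, which scrapes the status endpoint exposed by NGINX. As a minimal sketch (the endpoint is a placeholder for wherever your NGINX `stub_status` data is served), such a source can look like this:

```toml
[sources.nginx_metrics]
type = "nginx_metrics"
endpoints = ["http://localhost/metrics"] # the endpoint where NGINX metrics are exposed
```

The Axiom sink then ingests the scraped metrics: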
```toml
[sinks.axiom]
type = "axiom" # must be: axiom
inputs = ["nginx_metrics"] # use the metrics from the NGINX source
dataset = "DATASET_NAME" # replace with the name of your Axiom dataset
token = "API_TOKEN" # replace with your Axiom API token
```
<Replacement />
Finally, you can start Vector with your configuration file: `vector --config /path/to/your/vector.toml`
## Send Syslog logs to Axiom
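The sink below reads from a `syslog` source. As a minimal sketch (the listen address is a placeholder), such a source can accept Syslog messages over TCP like this:

```toml
[sources.my_source_id]
type = "syslog"
address = "0.0.0.0:514" # replace with the address and port to listen on
mode = "tcp"
```

The Axiom sink then forwards the received messages to your dataset: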
```toml
[sinks.axiom]
type = "axiom"
inputs = [ "my_source_id" ] # required
dataset = "DATASET_NAME" # replace with the name of your Axiom dataset
token = "API_TOKEN" # replace with your Axiom API token
```
<Replacement />
## Send Prometheus metrics to Axiom
To send Prometheus scrape metrics using the Axiom sink, create a configuration file, for example `vector.toml`, with the following code:
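The sink below reads from a `prometheus_scrape` source. As a minimal sketch (the endpoint is a placeholder for your Prometheus server or exporter), such a source can look like this:

```toml
[sources.my_prometheus_source]
type = "prometheus_scrape"
endpoints = ["http://localhost:9090/metrics"] # replace with your Prometheus endpoint
```

The Axiom sink then forwards the scraped metrics to your dataset: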
```toml
[sinks.axiom]
type = "axiom" # Axiom type
inputs = ["my_prometheus_source"] # connect the Axiom sink to your Prometheus source
dataset = "DATASET_NAME" # replace with the name of your Axiom dataset
token = "API_TOKEN" # replace with your Axiom API token
```
<Replacement />
Check out the advanced configuration options for batch, buffer, and encoding in the [Vector documentation on the Axiom sink](https://vector.dev/docs/reference/configuration/sinks/axiom/).
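As an illustration (the values below are placeholders, not recommendations), such tuning on the Axiom sink can look like this:

```toml
[sinks.axiom]
type = "axiom"
inputs = ["VECTOR_SOURCE_ID"]
token = "API_TOKEN"
dataset = "DATASET_NAME"
compression = "gzip" # compress request payloads

[sinks.axiom.batch]
max_events = 1000    # flush after this many events
timeout_secs = 5     # or after this many seconds

[sinks.axiom.buffer]
type = "memory"      # buffer events in memory
max_events = 10000   # apply backpressure beyond this
```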
## Timestamp in legacy Vector versions
If you use Vector version v0.41.1 (released on September 11, 2024) or earlier, use the `@timestamp` field instead of `_time` to specify the timestamp in the event data you send to Axiom. For example: `{"@timestamp":"2022-04-14T21:30:30.658Z..."}`. For more information, see [Requirements of the timestamp field](/reference/field-restrictions#requirements-of-the-timestamp-field). For Vector version v0.41.1 or earlier, the requirements explained on that page apply to the `@timestamp` field, not to `_time`.
If you use Vector version v0.42.0 (released on October 21, 2024) or newer, use the `_time` field, as with other collectors.
### Upgrade from legacy Vector version
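A `remap` transform (named `migrate`, matching the sink input below) can move the legacy `@timestamp` field into `_time` during the upgrade. As a minimal sketch (assuming the rename is all your pipeline needs; `VECTOR_SOURCE_ID` is a placeholder for your source), it can look like this:

```toml
[transforms.migrate]
type = "remap"
inputs = ["VECTOR_SOURCE_ID"]
source = '''
._time = del(."@timestamp")
'''
```

The Axiom sink itself stays essentially the same and reads from the `migrate` transform: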
```toml
[sinks.debug]
type = "axiom"
inputs = [ "migrate" ]
dataset = "DATASET_NAME" # No change
token = "API_TOKEN" # No change
[sinks.debug.encoding]
codec = "json"
```

```toml
type = "axiom"
compression = "gzip" # Set the compression algorithm