Commit a056b9e

Merge pull request #2016 from fluent/alexakreizinger/sc-143167/vale-final-sweep-for-stream-processing
2 parents 1594f98 + 497365f commit a056b9e

File tree

8 files changed: +19 −14 lines changed

stream-processing/changelog.md

Lines changed: 3 additions & 3 deletions

@@ -37,18 +37,18 @@ For conditionals, added the new _@record_ functions:
 | `@record.time()` | Returns the record timestamp. |
 | `@record.contains(key)` | Returns `true` if `key` exists in the record, or `false` if not. |
 
-### IS NULL, IS NOT NULL
+### `IS NULL` and `IS NOT NULL`
 
 Added `IS NULL` and `IS NOT NULL` statements to determine whether an existing key in a record has a null value. For example:
 
 ```sql
 SELECT * FROM STREAM:test WHERE key3['sub1'] IS NOT NULL;
 ```
 
-For more details, see [Check Keys and NULL values](../stream-processing/getting-started/check-keys-null-values.md).
+For more details, see [Check keys and null values](../stream-processing/getting-started/check-keys-null-values.md).
 
 ## Fluent Bit v1.1
 
-> Release date: May 09, 2019
+> Release date: 2019-05-09
 
 Added the stream processor to Fluent Bit.

stream-processing/getting-started/check-keys-null-values.md

Lines changed: 2 additions & 1 deletion

@@ -23,7 +23,8 @@ SELECT * FROM STREAM:test WHERE phone IS NOT NULL;
 ## Check if a key exists
 
 You can also confirm whether a certain key exists in a record at all, regardless of its value. Fluent Bit provides specific record functions that you can use in the condition part of the SQL statement. The following function determines whether `key` exists in a record:
-```text
+
+```sql
 @record.contains(key)
 ```

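For context, the record function documented in the hunk above is used in the condition part of a query. A minimal sketch, assuming an illustrative stream named `test` and a key named `phone`:

```sql
-- Select only records that contain the key 'phone',
-- regardless of whether its value is null.
SELECT * FROM STREAM:test WHERE @record.contains(phone);
```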
stream-processing/getting-started/fluent-bit-sql.md

Lines changed: 5 additions & 4 deletions

@@ -26,7 +26,7 @@ A `SELECT` statement not associated with stream creation will send the results t
 
 You can filter the results of this query by applying a condition by using a `WHERE` statement. For information about the `WINDOW` and `GROUP BY` statements, see [Aggregation functions](#aggregation-functions).
 
-#### Examples
+#### Examples [#select-examples]
 
 Selects all keys from records that originate from a stream called `apache`:
 
@@ -50,7 +50,7 @@ CREATE STREAM stream_name
 
 Creates a new stream of data using the results from a `SELECT` statement. If the `Tag` property in the `WITH` statement is set, this new stream can optionally be re-ingested into the Fluent Bit pipeline.
 
-#### Examples
+#### Examples [#create-stream-examples]
 
 Creates a new stream called `hello_` from a stream called `apache`:
 
@@ -101,6 +101,7 @@ Returns the minimum value of a key in a set of records.
 ```sql
 SELECT MAX(key) FROM STREAM:apache;
 ```
+
 Returns the maximum value of a key in a set of records.
 
 ### `SUM`
@@ -111,7 +112,7 @@ SELECT SUM(key) FROM STREAM:apache;
 
 Calculates the sum of all values of a key in a set of records.
 
-## Time Functions
+## Time functions
 
 Use time functions to add a new key with time data into a record.
 
@@ -131,7 +132,7 @@ SELECT UNIX_TIMESTAMP() FROM STREAM:apache;
 
 Adds the current Unix time to a record. Output example: `1552196165`.
 
-## Record Functions
+## Record functions
 
 Use record functions to append new keys to a record using values from the record's context.

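The hunks above touch the `SELECT`, `CREATE STREAM`, and aggregation sections of the SQL reference. As context, a hedged sketch of how those pieces combine (the stream name `apache`, key `size`, tag `agg.data`, and new stream name `avg_size` are illustrative; `WINDOW TUMBLING` follows the stream processor's windowing syntax):

```sql
-- Create a re-ingestable stream from an aggregation:
-- the average of the 'size' key over a 5-second tumbling window.
CREATE STREAM avg_size
  WITH (tag='agg.data')
  AS SELECT AVG(size) FROM STREAM:apache WINDOW TUMBLING (5 SECOND);
```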
stream-processing/introduction.md

Lines changed: 1 addition & 1 deletion

@@ -1,6 +1,6 @@
 # Introduction to stream processing
 
-![](../.gitbook/assets/stream_processor.png)
+![Fluent Bit stream processing](../.gitbook/assets/stream_processor.png)
 
 Fluent Bit is a fast and flexible log processor that collects, parses, filters, and delivers logs to remote databases, where data analysis can then be performed.

stream-processing/overview.md

Lines changed: 2 additions & 2 deletions

@@ -10,7 +10,7 @@ To understand how stream processing works in Fluent Bit, follow this overview of
 
 Most of the phases in the pipeline are implemented through plugins: input, filter, and output.
 
-![](../.gitbook/assets/flb_pipeline.png)
+![Fluent Bit pipeline flow](../.gitbook/assets/flb_pipeline.png)
 
 Filters can perform specific record modifications like appending or removing a key, enriching with metadata (for example, the Kubernetes filter), or discarding records based on specific conditions. After data is stored, no further modifications are made, but records can optionally be redirected to the stream processor.
 
@@ -20,7 +20,7 @@ The stream processor is an independent subsystem that checks for new records hit
 
 Every input instance is considered a stream. These streams collect data and ingest records into the pipeline.
 
-![](../.gitbook/assets/flb_pipeline_sp.png)
+![Fluent Bit pipeline flow plus stream processor](../.gitbook/assets/flb_pipeline_sp.png)
 
 By configuring specific SQL queries, you can perform specific tasks like key selections, filtering, and data aggregation. Keep in mind that there is no database; everything is schema-less and happens in memory. Concepts like tables that are common in relational databases don't exist in Fluent Bit.

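As background for the overview above, the stream processor's SQL tasks are defined in a separate streams file that the main configuration references. A minimal sketch in classic-mode configuration syntax (the file names and the task are illustrative; the `Streams_File` key and `[STREAM_TASK]` section follow the stream processor documentation):

```text
# fluent-bit.conf (main configuration)
[SERVICE]
    Flush        1
    Streams_File stream_processor.conf

# stream_processor.conf (referenced streams file)
[STREAM_TASK]
    Name  count_5s
    Exec  SELECT COUNT(*) FROM STREAM:apache WINDOW TUMBLING (5 SECOND);
```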
Lines changed: 3 additions & 3 deletions

@@ -1,7 +1,7 @@
 # Introduction
 
-![](../.gitbook/assets/stream_processor.png)
+![Fluent Bit stream processing](../.gitbook/assets/stream_processor.png)
 
-[Fluent Bit](https://fluentbit.io) is a fast and flexible Log processor that aims to collect, parse, filter and deliver logs to remote databases, so Data Analysis can be performed.
+[Fluent Bit](https://fluentbit.io) is a fast and flexible log processor that aims to collect, parse, filter, and deliver logs to remote databases so data analysis can be performed.
 
-Data Analysis usually happens after the data is stored and indexed in a database, but for real-time and complex analysis needs, process the data while it's still in motion in the Log processor brings a lot of advantages and this approach is called **Stream Processing on the Edge**.
+Data analysis usually happens after the data is stored and indexed in a database. However, for real-time and complex analysis needs, processing the data while it's still in motion in the log processor brings a lot of advantages. This approach is called **Stream Processing on the Edge**.

vale-styles/FluentBit/Headings.yml

Lines changed: 1 addition & 0 deletions

@@ -112,6 +112,7 @@ exceptions:
 - SignalFx
 - SIMD
 - Slack
+- SQL
 - SSL
 - StatsD
 - Studio

vale-styles/FluentBit/Spelling-exceptions.txt

Lines changed: 2 additions & 0 deletions

@@ -193,6 +193,8 @@ stdout
 strftime
 subcommand
 subcommands
+subkey
+subkeys
 subquery
 subrecord
 substring
