Stream processing: fix remaining vale/markdownlint errors #2016

Merged
6 changes: 3 additions & 3 deletions stream-processing/changelog.md
@@ -37,18 +37,18 @@ For conditionals, added the new _@record_ functions:
| `@record.time()` | Returns the record timestamp. |
| `@record.contains(key)` | Returns `true` if `key` exists in the record, or `false` if not. |

-### IS NULL, IS NOT NULL
+### `IS NULL` and `IS NOT NULL`

Added `IS NULL` and `IS NOT NULL` statements to determine whether an existing key in a record has a null value. For example:

```sql
SELECT * FROM STREAM:test WHERE key3['sub1'] IS NOT NULL;
```

-For more details, see [Check Keys and NULL values](../stream-processing/getting-started/check-keys-null-values.md).
+For more details, see [Check keys and null values](../stream-processing/getting-started/check-keys-null-values.md).

## Fluent Bit v1.1

-> Release date: May 09, 2019
+> Release date: 2019-05-09

Added the stream processor to Fluent Bit.
stream-processing/getting-started/check-keys-null-values.md
@@ -23,7 +23,8 @@ SELECT * FROM STREAM:test WHERE phone IS NOT NULL;
## Check if a key exists

You can also confirm whether a certain key exists in a record at all, regardless of its value. Fluent Bit provides specific record functions that you can use in the condition part of the SQL statement. The following function determines whether `key` exists in a record:
-```text
+
+```sql
@record.contains(key)
```

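As a sketch of how this function reads in a condition (assuming a stream named `test` that may carry a `phone` key, as in the example above):

```sql
-- Keep only records that actually contain the 'phone' key,
-- regardless of whether its value is null.
SELECT * FROM STREAM:test WHERE @record.contains(phone);
```

Unlike `phone IS NOT NULL`, this matches records where the key exists even when its value is null.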
9 changes: 5 additions & 4 deletions stream-processing/getting-started/fluent-bit-sql.md
@@ -26,7 +26,7 @@ A `SELECT` statement not associated with stream creation will send the results t

You can filter the results of this query by applying a condition by using a `WHERE` statement. For information about the `WINDOW` and `GROUP BY` statements, see [Aggregation functions](#aggregation-functions).

-#### Examples
+#### Examples [#select-examples]

Selects all keys from records that originate from a stream called `apache`:

@@ -50,7 +50,7 @@ CREATE STREAM stream_name

Creates a new stream of data using the results from a `SELECT` statement. If the `Tag` property in the `WITH` statement is set, this new stream can optionally be re-ingested into the Fluent Bit pipeline.

-#### Examples
+#### Examples [#create-stream-examples]

Creates a new stream called `hello_` from a stream called `apache`:

@@ -101,6 +101,7 @@ Returns the minimum value of a key in a set of records.
```sql
SELECT MAX(key) FROM STREAM:apache;
```
+
Returns the maximum value of a key in a set of records.

### `SUM`
@@ -111,7 +112,7 @@ SELECT SUM(key) FROM STREAM:apache;

Calculates the sum of all values of a key in a set of records.
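The aggregation functions above can be combined with the `WINDOW` and `GROUP BY` statements mentioned earlier; a sketch, in which the `host` and `size` keys and the 10-second window are illustrative assumptions rather than part of this change:

```sql
-- Hypothetical: sum the 'size' key per host over a 10-second tumbling window
SELECT host, SUM(size)
    FROM STREAM:apache
    WINDOW TUMBLING (10 SECOND)
    GROUP BY host;
```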

-## Time Functions
+## Time functions

Use time functions to add a new key with time data into a record.

@@ -131,7 +132,7 @@ SELECT UNIX_TIMESTAMP() FROM STREAM:apache;

Adds the current Unix time to a record. Output example: `1552196165`.

-## Record Functions
+## Record functions

Use record functions to append new keys to a record using values from the record's context.

2 changes: 1 addition & 1 deletion stream-processing/introduction.md
@@ -1,6 +1,6 @@
# Introduction to stream processing

-![](../.gitbook/assets/stream_processor.png)
+![Fluent Bit stream processing](../.gitbook/assets/stream_processor.png)

Fluent Bit is a fast and flexible log processor that collects, parses, filters, and delivers logs to remote databases, where data analysis can then be performed.

4 changes: 2 additions & 2 deletions stream-processing/overview.md
@@ -10,7 +10,7 @@ To understand how stream processing works in Fluent Bit, follow this overview of

Most of the phases in the pipeline are implemented through plugins: input, filter, and output.

-![](../.gitbook/assets/flb_pipeline.png)
+![Fluent Bit pipeline flow](../.gitbook/assets/flb_pipeline.png)

Filters can perform specific record modifications like appending or removing a key, enriching with metadata (for example, the Kubernetes filter), or discarding records based on specific conditions. After data is stored, no further modifications are made, but records can optionally be redirected to the stream processor.

@@ -20,7 +20,7 @@ The stream processor is an independent subsystem that checks for new records hit

Every input instance is considered a stream. These streams collect data and ingest records into the pipeline.

-![](../.gitbook/assets/flb_pipeline_sp.png)
+![Fluent Bit pipeline flow plus stream processor](../.gitbook/assets/flb_pipeline_sp.png)

By configuring specific SQL queries, you can perform specific tasks like key selections, filtering, and data aggregation. Keep in mind that there is no database; everything is schema-less and happens in memory. Concepts like tables that are common in relational databases don't exist in Fluent Bit.
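As an illustrative sketch of the kind of query involved (the stream name, tag, and `code` key are assumptions for this example, not part of the change):

```sql
-- Hypothetical: derive a stream of error records from an 'apache' stream
-- and re-ingest it into the pipeline under the tag 'logs.errors'.
CREATE STREAM apache_errors
    WITH (tag='logs.errors')
    AS SELECT * FROM STREAM:apache WHERE code >= 400;
```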

6 changes: 3 additions & 3 deletions stream-processing/stream-processing.md
@@ -1,7 +1,7 @@
# Introduction

-![](../.gitbook/assets/stream_processor.png)
+![Fluent Bit stream processing](../.gitbook/assets/stream_processor.png)

-[Fluent Bit](https://fluentbit.io) is a fast and flexible Log processor that aims to collect, parse, filter and deliver logs to remote databases, so Data Analysis can be performed.
+[Fluent Bit](https://fluentbit.io) is a fast and flexible log processor that aims to collect, parse, filter, and deliver logs to remote databases so data analysis can be performed.

-Data Analysis usually happens after the data is stored and indexed in a database, but for real-time and complex analysis needs, process the data while it's still in motion in the Log processor brings a lot of advantages and this approach is called **Stream Processing on the Edge**.
+Data analysis usually happens after the data is stored and indexed in a database. However, for real-time and complex analysis needs, processing the data while it's still in motion in the log processor brings a lot of advantages. This approach is called **Stream Processing on the Edge**.
1 change: 1 addition & 0 deletions vale-styles/FluentBit/Headings.yml
@@ -112,6 +112,7 @@ exceptions:
- SignalFx
- SIMD
- Slack
+- SQL
- SSL
- StatsD
- Studio
2 changes: 2 additions & 0 deletions vale-styles/FluentBit/Spelling-exceptions.txt
@@ -193,6 +193,8 @@ stdout
strftime
subcommand
subcommands
+subkey
+subkeys
subquery
subrecord
substring