[This connector](https://github.com/quixio/quix-samples/tree/main/python/destinations/s3-file) demonstrates how to consume data from a Kafka topic and write it to an AWS S3 bucket.
## How to run
Create a [Quix](https://portal.platform.quix.io/signup?xlink=github) account or log in, then visit the `Connectors` tab to use this connector.
Clicking `Set up connector` allows you to enter your connection details and runtime parameters.
Then either:
* click `Test connection & deploy` to deploy the pre-built and configured container into Quix.
* or click `Customise connector` to inspect or alter the code before deployment.
## Environment Variables
The connector uses the following environment variables, which generally correspond to the parameters of the [`S3FileSink`](https://quix.io/docs/quix-streams/connectors/sinks/amazon-s3-sink.html). Unless explicitly marked as required, these are optional and fall back to the `S3FileSink` defaults.
- `S3_BUCKET_DIRECTORY`: An optional path within the S3 bucket to use.
  **Default**: `""` (root)
- `FILE_FORMAT`: The file format to publish data as; options: [parquet, json].
  **Default**: `"parquet"`
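As a rough illustration of how these two variables shape the sink's output, here is a hedged sketch (the helper names, key layout, and serialization below are assumptions for illustration, not the connector's actual code):

```python
import json
import os


def build_object_key(directory: str, filename: str) -> str:
    """Join the optional S3_BUCKET_DIRECTORY prefix with a file name.

    An empty directory (the default) writes to the bucket root.
    Hypothetical helper, not part of the connector's API.
    """
    directory = directory.strip("/")
    return f"{directory}/{filename}" if directory else filename


def serialize(records: list[dict], file_format: str) -> bytes:
    """Serialize a batch of records according to FILE_FORMAT.

    Only the json branch is implemented here; parquet would require
    an extra dependency such as pyarrow, so it is stubbed out.
    """
    if file_format == "json":
        # Newline-delimited JSON: one record per line.
        return "\n".join(json.dumps(r) for r in records).encode("utf-8")
    if file_format == "parquet":
        raise NotImplementedError("parquet serialization needs pyarrow")
    raise ValueError(f"unsupported FILE_FORMAT: {file_format!r}")


# Read the variables the same way the connector's runtime would.
directory = os.environ.get("S3_BUCKET_DIRECTORY", "")
file_format = os.environ.get("FILE_FORMAT", "parquet")
```

For example, `build_object_key("sensor-data", "batch-0001.json")` yields `"sensor-data/batch-0001.json"`, while the default empty directory yields just `"batch-0001.json"`.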
## Requirements / Prerequisites
You will need an AWS account with an S3 bucket and credentials that permit writing to it in order to use this connector.
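For instance, the IAM identity the connector authenticates as typically needs at least `s3:PutObject` on the target bucket. A minimal policy sketch (the bucket name is a placeholder; your setup may require additional actions such as `s3:ListBucket`):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:PutObject"],
      "Resource": "arn:aws:s3:::YOUR_BUCKET_NAME/*"
    }
  ]
}
```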
## Contribute
Submit forked projects to the Quix [GitHub](https://github.com/quixio/quix-samples) repo. Any new project that we accept will be attributed to you and you'll receive $200 in Quix credit.
## Open Source
This project is open source under the Apache 2.0 license and available in our [GitHub](https://github.com/quixio/quix-samples) repo. Please star us and mention us on social to show your appreciation.