RisingWave
RisingWave is a distributed streaming database offering a standard SQL interface compatible with the PostgreSQL ecosystem, allowing integration without code changes. RisingWave treats streams as tables, enabling users to write complex queries on both streaming and historical data in an elegant manner. With RisingWave, users can focus on query analysis logic without needing to learn Java or the underlying API of a specific system.
This article describes how to import data from AutoMQ into the RisingWave database through RisingWave Cloud.
Refer to Deploy Locally to deploy AutoMQ, ensuring network connectivity between AutoMQ and RisingWave.
Quickly create a Topic named example_topic in AutoMQ and write a test JSON message into it by following these steps.
Use the Apache Kafka command line tool to create the Topic. Ensure you have access to a Kafka environment and the Kafka service is running. Below is an example command to create the Topic:
./kafka-topics.sh --create --topic example_topic --bootstrap-server 10.0.96.4:9092 --partitions 1 --replication-factor 1
When executing the command, replace topic and bootstrap-server with the Topic name and Kafka server address actually in use.
After creating the Topic, you can use the following command to verify whether the Topic has been successfully created.
./kafka-topics.sh --describe --topic example_topic --bootstrap-server 10.0.96.4:9092
Generate test data in JSON format that matches the schema of the source you will create later.
{
  "id": 1,
  "name": "Test User",
  "timestamp": "2023-11-10T12:00:00",
  "status": "active"
}
Use Kafka's command line tool or programming methods to write the test data into the Topic named example_topic. Below is an example using the command line tool:
echo '{"id": 1, "name": "Test User", "timestamp": "2023-11-10T12:00:00", "status": "active"}' | ./kafka-console-producer.sh --bootstrap-server 10.0.96.4:9092 --topic example_topic
Use the following command to view the data just written to the topic:
./kafka-console-consumer.sh --bootstrap-server 10.0.96.4:9092 --topic example_topic --from-beginning
When executing the command, replace topic and bootstrap-server with the Topic name and Kafka server address actually in use.
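The test record can also be produced programmatically instead of through the console tools. The sketch below builds the JSON payload in Python; the commented-out send is an assumption based on the kafka-python package (any Kafka-compatible client works), reusing the broker address and Topic name from the example:

```python
import json

def make_record(user_id: int, name: str, ts: str, status: str) -> bytes:
    # Serialize one test record to UTF-8 JSON, matching the example payload
    return json.dumps(
        {"id": user_id, "name": name, "timestamp": ts, "status": status}
    ).encode("utf-8")

record = make_record(1, "Test User", "2023-11-10T12:00:00", "active")

# Sending requires a reachable broker. With the kafka-python package
# (an assumption -- install it separately), the record can be produced
# to example_topic like this:
#
# from kafka import KafkaProducer
# producer = KafkaProducer(bootstrap_servers="10.0.96.4:9092")
# producer.send("example_topic", record)
# producer.flush()
```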
- Go to RisingWave Cloud Clusters to create a cluster.
- Navigate to RisingWave Cloud Source to create a source.
- Specify the cluster and database, and log in to the database.
- AutoMQ is 100% compatible with Apache Kafka®, so simply click Create source and select Kafka.
- Configure the connector according to the RisingWave Cloud guided interface, setting the source information and schema details.
- Confirm the generated SQL statement and click Confirm to complete the creation of the source.
AutoMQ's default port is 9092, and SSL is not enabled by default. To enable SSL, please refer to the Apache Kafka Documentation.
In this example, you can set the startup mode to earliest and use JSON format to access all data from the topic from the beginning.
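The generated statement should resemble the sketch below. The source name here is hypothetical, the column types are inferred from the test message, and the exact syntax varies by RisingWave version (recent versions use FORMAT PLAIN ENCODE JSON):

```sql
CREATE SOURCE IF NOT EXISTS example_source (
    id INT,
    name VARCHAR,
    "timestamp" TIMESTAMP,
    status VARCHAR
) WITH (
    connector = 'kafka',
    topic = 'example_topic',
    properties.bootstrap.server = '10.0.96.4:9092',
    scan.startup.mode = 'earliest'
) FORMAT PLAIN ENCODE JSON;
```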
- Navigate to the RisingWave Cloud Console and log in to your cluster.
- Run the following SQL statement to query the imported data, replacing the variable your_source_name with the custom name specified when creating the source.
SELECT * FROM {your_source_name} LIMIT 1;
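Because RisingWave speaks the PostgreSQL wire protocol, the same spot-check can be run from any Postgres client rather than the console. A minimal Python sketch, assuming the psycopg2 package and hypothetical connection details (RisingWave Cloud shows the actual host, port, and user in the console):

```python
def preview_query(source_name: str, limit: int = 1) -> str:
    # Build the spot-check SELECT; the source name is interpolated as an
    # identifier, so pass only trusted names
    return f"SELECT * FROM {source_name} LIMIT {limit};"

# Hypothetical connection details -- copy the real endpoint from the
# RisingWave Cloud console. With psycopg2 (assumed installed):
#
# import psycopg2
# with psycopg2.connect(host="<cluster-host>", port=4566,
#                       user="<user>", password="<password>",
#                       dbname="dev") as conn:
#     with conn.cursor() as cur:
#         cur.execute(preview_query("your_source_name"))
#         print(cur.fetchone())
```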