Move your data from MySQL to Apache Kafka, continuously

Continuously ingest data from different sources, transform it on the fly, and deliver it to any destination using RisingWave's connectors.
MySQL → RisingWave → Apache Kafka
MySQL
Start by creating a CDC source in RisingWave that connects to the upstream MySQL database, then declare the table you want to ingest from it:
CREATE SOURCE mysql_source WITH (
  connector = 'mysql-cdc',
  hostname = '127.0.0.1',
  port = '8306',
  username = 'root',
  password = '123456',
  database.name = 'mydb',
  server.id = 5888
);

CREATE TABLE orders_rw (
    order_id INTEGER PRIMARY KEY,
    customer_id INTEGER,
    order_status VARCHAR,
    total_amount DECIMAL,
    last_updated TIMESTAMP
)
FROM mysql_source TABLE 'mydb.orders';
For comprehensive configuration details, please refer to the MySQL CDC connector documentation.
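The CDC source above assumes the upstream mydb database already contains an orders table with matching column types, and that binlog replication is enabled for the connecting user. A minimal sketch of what that upstream table might look like on the MySQL side (illustrative only; adapt it to your actual schema):

CREATE TABLE mydb.orders (
    order_id INT PRIMARY KEY,
    customer_id INT,
    order_status VARCHAR(32),
    total_amount DECIMAL(10, 2),
    last_updated TIMESTAMP
);

-- Example row; every insert, update, and delete on this table is
-- captured from the binlog and reflected in orders_rw.
INSERT INTO mydb.orders (order_id, customer_id, order_status, total_amount, last_updated)
VALUES (1, 42, 'PAID', 99.90, NOW());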
RisingWave
Next, transform the change stream with plain SQL and deliver the aggregated results to Kafka through a sink:
CREATE SINK kafka_sink AS
SELECT
    order_status,
    COUNT(*) AS order_count,
    SUM(total_amount) AS total_revenue,
    AVG(total_amount) AS avg_order_value,
    MIN(last_updated) AS first_order_time,
    MAX(last_updated) AS last_order_time
FROM orders_rw
GROUP BY order_status
WITH (
   connector = 'kafka',
   properties.bootstrap.server = 'localhost:9092',
   topic = 'test',
   properties.message.max.bytes = 2000,
   -- the aggregation emits updates as well as inserts, so the plain
   -- (append-only) format requires this flag
   force_append_only = 'true'
)
FORMAT PLAIN ENCODE JSON;
For comprehensive configuration details, please refer to the Kafka connector documentation.
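If you also want the aggregated metrics queryable inside RisingWave itself, not only in Kafka, the same query can back a materialized view. This is a sketch; the name order_metrics is illustrative and not part of the example above:

CREATE MATERIALIZED VIEW order_metrics AS
SELECT
    order_status,
    COUNT(*) AS order_count,
    SUM(total_amount) AS total_revenue,
    AVG(total_amount) AS avg_order_value
FROM orders_rw
GROUP BY order_status;

-- Ad-hoc check: results keep updating as new CDC events arrive.
SELECT * FROM order_metrics ORDER BY order_status;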
Apache Kafka
The aggregated order metrics now flow continuously into the test topic as JSON messages, ready for any downstream consumer.
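To verify the end of the pipeline without leaving SQL, you can read the sink's topic back into RisingWave as a throwaway source. The name kafka_sink_check and the column types below are assumptions that mirror the sink's output, not part of the original example:

-- Hypothetical verification source reading the sink's output topic.
CREATE SOURCE kafka_sink_check (
    order_status VARCHAR,
    order_count BIGINT,
    total_revenue DECIMAL,
    avg_order_value DECIMAL,
    first_order_time TIMESTAMP,
    last_order_time TIMESTAMP
) WITH (
    connector = 'kafka',
    topic = 'test',
    properties.bootstrap.server = 'localhost:9092',
    scan.startup.mode = 'earliest'
) FORMAT PLAIN ENCODE JSON;

A SELECT against kafka_sink_check should show the same aggregated rows that the sink emitted.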