Move your data from Confluent Cloud to Delta Lake, continuously

Continuously ingest data from different sources, transform it on the fly, and deliver it to any destination using RisingWave’s connectors.
Confluent Cloud → RisingWave → Delta Lake
Confluent Cloud
-- Note: a PRIMARY KEY is only used with keyed formats such as FORMAT UPSERT,
-- so none is declared for this plain JSON stream.
CREATE SOURCE IF NOT EXISTS orders_rw (
    order_id INTEGER,
    customer_id INTEGER,
    order_status VARCHAR,
    total_amount DECIMAL,
    last_updated TIMESTAMP)
WITH (
    connector = 'kafka',
    topic = 'topic_0',
    -- Replace the placeholders below with your Confluent Cloud bootstrap
    -- server and API key/secret.
    properties.bootstrap.server = 'xyz-x00xx.us-east-1.aws.confluent.cloud:9092',
    scan.startup.mode = 'earliest',
    properties.security.protocol = 'SASL_SSL',
    properties.sasl.mechanism = 'PLAIN',
    properties.sasl.username = 'username',
    properties.sasl.password = 'password'
) FORMAT PLAIN ENCODE JSON;
For detailed steps on ingesting data from Confluent Cloud, refer to the Confluent Cloud documentation.
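To sanity-check the ingestion before wiring up a sink, you can materialize the stream and query it from RisingWave. The view name orders_check below is illustrative, not part of the original setup:

-- Illustrative check: materialize the raw stream, then query it.
CREATE MATERIALIZED VIEW orders_check AS
SELECT order_id, order_status, total_amount, last_updated
FROM orders_rw;

SELECT * FROM orders_check LIMIT 10;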
RisingWave
CREATE SINK dl_sink AS
SELECT
    order_status,
    COUNT(*) AS order_count,
    SUM(total_amount) AS total_revenue,
    AVG(total_amount) AS avg_order_value,
    MIN(last_updated) AS first_order_time,
    MAX(last_updated) AS last_order_time
FROM orders_rw
GROUP BY order_status
WITH (
    connector = 'deltalake',
    type = 'append-only',
    -- The aggregation emits updates, so force the sink to write appends only.
    force_append_only = 'true',
    -- Replace with your Delta Lake table location and S3 credentials.
    location = 's3a://my-delta-lake-bucket/path/to/table',
    s3.endpoint = 'https://s3.ap-southeast-1.amazonaws.com',
    s3.access.key = 'access_key',
    s3.secret.key = 'secret_key'
);
For comprehensive configuration details, please refer to the Delta Lake connector documentation.
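Once created, you can confirm the sink is registered from the same RisingWave session:

SHOW SINKS;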
Delta Lake
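Because the sink writes an ordinary Delta table, any Delta-compatible engine can read the results. A minimal sketch in Spark SQL, assuming Spark is configured with the Delta Lake extension and the same S3 credentials (the path mirrors the sink's location above):

-- Illustrative read of the Delta table that RisingWave writes to.
SELECT order_status, order_count, total_revenue
FROM delta.`s3a://my-delta-lake-bucket/path/to/table`
ORDER BY total_revenue DESC;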