Ingest data from Confluent Cloud
Confluent Cloud is a fully managed, cloud-native data streaming platform built on Apache Kafka, with Apache Flink for stream processing.
Quick Start
Connect in minutes with SQL
Use CREATE SOURCE or CREATE TABLE to ingest data from Confluent Cloud into RisingWave. No plugins, no middleware — just PostgreSQL-compatible SQL.
CREATE TABLE IF NOT EXISTS orders_rw (
    order_id INTEGER PRIMARY KEY,
    customer_id INTEGER,
    order_status VARCHAR,
    total_amount DECIMAL,
    last_updated TIMESTAMP
)
WITH (
    connector = 'kafka',
    topic = 'topic_0',
    properties.bootstrap.server = 'xyz-x00xx.us-east-1.aws.confluent.cloud:9092',
    scan.startup.mode = 'earliest',
    properties.security.protocol = 'SASL_SSL',
    properties.sasl.mechanism = 'PLAIN',
    properties.sasl.username = 'username',
    properties.sasl.password = 'password'
) FORMAT PLAIN ENCODE JSON;

Note that a PRIMARY KEY constraint requires CREATE TABLE (or an upsert format); for append-only ingestion without a key, use CREATE SOURCE. For the detailed steps to ingest data from Confluent Cloud, refer to the Confluent Cloud documentation.
Capabilities
What you can do with RisingWave + Confluent Cloud
Real-time Ingestion
Continuously stream data from Confluent Cloud into RisingWave with sub-second latency. Process millions of events per second.
SQL Transformations
Join Confluent Cloud data with other sources, apply windowing, aggregation, and filtering — all in standard SQL.
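As a sketch of a windowed transformation, the following applies a one-hour tumbling window to the orders_rw table defined above (the view name, window size, and aggregates are illustrative, not part of the original guide):

```sql
-- Hypothetical example: hourly revenue per order status, computed with a
-- 1-hour tumbling window over the ingested Confluent Cloud topic.
CREATE MATERIALIZED VIEW hourly_revenue AS
SELECT
    order_status,
    window_start,
    SUM(total_amount) AS revenue,
    COUNT(*) AS order_count
FROM TUMBLE(orders_rw, last_updated, INTERVAL '1 hour')
GROUP BY order_status, window_start;
```

RisingWave maintains the view incrementally as new events arrive, so each window's aggregates update without reprocessing the whole topic.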
Materialized Views
Create incrementally maintained materialized views over Confluent Cloud data. Always fresh, always queryable.
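A minimal sketch of such a view over the orders_rw table above (the view name and threshold are illustrative):

```sql
-- Hypothetical example: a continuously maintained per-customer summary.
CREATE MATERIALIZED VIEW customer_totals AS
SELECT
    customer_id,
    COUNT(*) AS order_count,
    SUM(total_amount) AS lifetime_value
FROM orders_rw
GROUP BY customer_id;

-- Query it like any table; results reflect the latest ingested events.
SELECT * FROM customer_totals WHERE lifetime_value > 1000;
```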
Multi-format Support
Supports Avro, JSON, Protobuf, CSV, and more. Compatible with Schema Registry for schema evolution.
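For instance, an Avro-encoded topic can be ingested by pointing RisingWave at Confluent Schema Registry; columns are then derived from the registered schema rather than declared by hand. The registry URL and credentials below are placeholders, not values from this guide:

```sql
-- Hypothetical example: ingesting an Avro topic with schemas resolved
-- from Confluent Schema Registry (URL and credentials are placeholders).
CREATE SOURCE orders_avro
WITH (
    connector = 'kafka',
    topic = 'orders_avro',
    properties.bootstrap.server = 'xyz-x00xx.us-east-1.aws.confluent.cloud:9092',
    properties.security.protocol = 'SASL_SSL',
    properties.sasl.mechanism = 'PLAIN',
    properties.sasl.username = 'username',
    properties.sasl.password = 'password'
) FORMAT PLAIN ENCODE AVRO (
    schema.registry = 'https://example-sr.us-east-1.aws.confluent.cloud',
    schema.registry.username = 'sr_key',
    schema.registry.password = 'sr_secret'
);
```

Because the schema lives in the registry, compatible schema evolution on the producer side does not require redefining the source.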
Start streaming in minutes
Connect to Confluent Cloud with just a few lines of SQL. No infrastructure to manage, no code to write.