Ingest data from Apache Kafka
An open-source distributed event streaming platform
Quick Start
Connect in minutes with SQL
Use CREATE SOURCE or CREATE TABLE to ingest data from Apache Kafka into RisingWave. No plugins, no middleware — just PostgreSQL-compatible SQL.
CREATE SOURCE IF NOT EXISTS orders_rw (
    order_id INTEGER PRIMARY KEY,
    customer_id INTEGER,
    order_status VARCHAR,
    total_amount DECIMAL,
    last_updated TIMESTAMP
)
WITH (
    connector = 'kafka',
    topic = 'demo_topic',
    properties.bootstrap.server = '172.10.1.1:9090,172.10.1.2:9090',
    scan.startup.mode = 'latest',
    -- If set, consumption starts from this timestamp, overriding scan.startup.mode
    scan.startup.timestamp.millis = '140000000'
) FORMAT PLAIN ENCODE AVRO (
    message = 'message_name',
    schema.registry = 'http://127.0.0.1:8081'
);
For comprehensive configuration details, refer to the Kafka connector documentation.
Capabilities
What you can do with RisingWave + Apache Kafka
Real-time Ingestion
Continuously stream data from Apache Kafka into RisingWave with sub-second latency. Process millions of events per second.
SQL Transformations
Join Apache Kafka data with other sources, apply windowing, aggregation, and filtering — all in standard SQL.
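As a sketch of what such a transformation might look like, assuming the orders_rw source from the Quick Start and a hypothetical customers table:

```sql
-- Hypothetical example: per-customer revenue in 1-minute tumbling windows,
-- joining the Kafka-backed orders_rw source with an assumed customers table.
SELECT
    c.customer_name,
    window_start,
    SUM(o.total_amount) AS revenue
FROM TUMBLE(orders_rw, last_updated, INTERVAL '1 MINUTE') AS o
JOIN customers AS c ON o.customer_id = c.customer_id
WHERE o.order_status = 'COMPLETED'
GROUP BY c.customer_name, window_start;
```

Here TUMBLE assigns each event to a fixed one-minute window based on its last_updated timestamp; the customers table and the 'COMPLETED' status value are illustrative assumptions.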
Materialized Views
Create incrementally maintained materialized views over Apache Kafka data. Always fresh, always queryable.
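A minimal sketch, assuming the orders_rw source from the Quick Start:

```sql
-- Hypothetical example: an incrementally maintained view over orders_rw.
CREATE MATERIALIZED VIEW order_status_counts AS
SELECT order_status, COUNT(*) AS order_count
FROM orders_rw
GROUP BY order_status;

-- Query it like any table; results stay fresh as new Kafka events arrive.
SELECT * FROM order_status_counts;
```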
Multi-format Support
Supports Avro, JSON, Protobuf, CSV, and more. Compatible with Schema Registry for schema evolution.
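For instance, the Quick Start topic could be ingested as JSON instead of Avro; this hypothetical variant declares columns inline rather than deriving them from a schema registry:

```sql
-- Hypothetical variant: the same topic ingested with ENCODE JSON.
CREATE SOURCE IF NOT EXISTS orders_json (
    order_id INTEGER,
    customer_id INTEGER,
    order_status VARCHAR,
    total_amount DECIMAL,
    last_updated TIMESTAMP
)
WITH (
    connector = 'kafka',
    topic = 'demo_topic',
    properties.bootstrap.server = '172.10.1.1:9090'
) FORMAT PLAIN ENCODE JSON;
```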
Resources
Learn more
Complete configuration reference, authentication options, and advanced features for the Apache Kafka source connector.
RisingWave also supports Apache Kafka as a destination. View the destination connector.
Get RisingWave running locally in 5 minutes and try your first streaming pipeline.
Start streaming in minutes
Connect to Apache Kafka with just a few lines of SQL. No infrastructure to manage, no code to write.