Trusted by 1,000+ data-driven organizations to harness continuous insights from both live and historical data.
-- 1. Connect to your object store
CREATE CONNECTION my_iceberg_connection WITH (
    type = 'iceberg',
    warehouse.path = 's3://your-bucket/iceberg-stocks',
    hosted_catalog = true  -- No external catalog needed!
);

-- 2. Create your Iceberg table
CREATE TABLE stock_trades (
    trade_id INT PRIMARY KEY,
    symbol STRING,
    trade_price DOUBLE,
    trade_volume INT,
    trade_time TIMESTAMP
) ENGINE = iceberg;

-- 3. Stream data into your table
INSERT INTO stock_trades
SELECT * FROM your_kafka_stock_stream;
RisingWave offers purpose-built streaming connectors with built-in back-pressure detection, enabling efficient, decentralized data ingestion from a wide range of sources.
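As a sketch of what connector-based ingestion looks like, the statement below declares a Kafka-backed source; the broker address, topic, and column names are illustrative placeholders, not part of the example above.

```sql
-- Hypothetical Kafka source: broker, topic, and schema are placeholders
CREATE SOURCE stock_quotes (
    symbol VARCHAR,
    price DOUBLE PRECISION,
    quote_time TIMESTAMP
) WITH (
    connector = 'kafka',
    topic = 'stock-quotes',
    properties.bootstrap.server = 'your-broker:9092'
) FORMAT PLAIN ENCODE JSON;
```

Once declared, the source can be queried or fed into tables and materialized views like any other relation.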
Live data has a short shelf life. RisingWave triggers incremental updates automatically, guaranteeing always-fresh insights and letting users get the most value out of their data.
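Incremental maintenance is typically expressed through materialized views. The sketch below, built on the `stock_trades` table from the snippet above, keeps one-minute averages continuously up to date; the view name and window size are illustrative.

```sql
-- Incrementally maintained: updates as new trades arrive, no manual refresh
CREATE MATERIALIZED VIEW avg_price_1min AS
SELECT
    symbol,
    window_start,
    AVG(trade_price)  AS avg_price,
    SUM(trade_volume) AS total_volume
FROM TUMBLE(stock_trades, trade_time, INTERVAL '1 minute')
GROUP BY symbol, window_start;
```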
RisingWave makes data pipelines composable, allowing tables and views generated by one query to be seamlessly used as inputs for downstream queries.
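To illustrate composability, the sketch below layers one materialized view on top of another, starting from the `stock_trades` table defined earlier; the view names and the threshold are illustrative.

```sql
-- Layer 1: per-symbol aggregates over the raw table
CREATE MATERIALIZED VIEW symbol_stats AS
SELECT symbol, COUNT(*) AS trade_count, AVG(trade_price) AS avg_price
FROM stock_trades
GROUP BY symbol;

-- Layer 2: the view above is itself an input to a downstream query,
-- and both layers stay incrementally up to date
CREATE MATERIALIZED VIEW active_symbols AS
SELECT symbol, avg_price
FROM symbol_stats
WHERE trade_count > 1000;
```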
Interoperability is a core design principle of RisingWave. As Iceberg and Delta increasingly become the de facto standards for data lakehouse table formats, RisingWave provides robust read and write support for both.
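Writing out to a lakehouse table can be sketched as a sink declaration like the one below; the exact connector options depend on your catalog setup, and the bucket path, database, and table names here are assumptions for illustration.

```sql
-- Illustrative Iceberg sink: option names and values vary by catalog setup
CREATE SINK trades_to_lake FROM stock_trades
WITH (
    connector = 'iceberg',
    type = 'upsert',
    primary_key = 'trade_id',
    warehouse.path = 's3://your-bucket/iceberg-stocks',
    database.name = 'market',
    table.name = 'stock_trades'
);
```

Downstream engines that read Iceberg can then query the same table without any RisingWave-specific tooling.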