Setting Up a Local Streaming Data Stack with Docker
Get a complete streaming data stack running locally in minutes: RisingWave (streaming database) + Kafka (event streaming) + Grafana (dashboards). All via Docker Compose.
Docker Compose
Save the following as `docker-compose.yml`. Kafka runs in KRaft mode (no ZooKeeper) with two listeners: containers on the Compose network reach it at `kafka:9092`, and clients on the host at `localhost:29092`.

```yaml
version: '3.8'
services:
  risingwave:
    image: risingwavelabs/risingwave:latest
    command: single_node   # run all RisingWave components in one process
    ports:
      - '4566:4566'   # SQL over the Postgres wire protocol
      - '5691:5691'   # built-in dashboard
  kafka:
    image: confluentinc/cp-kafka:latest
    ports:
      - '29092:29092'   # host-facing listener
    environment:
      KAFKA_NODE_ID: 1
      KAFKA_PROCESS_ROLES: broker,controller
      KAFKA_CONTROLLER_QUORUM_VOTERS: 1@kafka:29093
      KAFKA_LISTENERS: PLAINTEXT://0.0.0.0:9092,PLAINTEXT_HOST://0.0.0.0:29092,CONTROLLER://0.0.0.0:29093
      KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://kafka:9092,PLAINTEXT_HOST://localhost:29092
      KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: PLAINTEXT:PLAINTEXT,PLAINTEXT_HOST:PLAINTEXT,CONTROLLER:PLAINTEXT
      KAFKA_CONTROLLER_LISTENER_NAMES: CONTROLLER
      KAFKA_INTER_BROKER_LISTENER_NAME: PLAINTEXT
      # A single broker cannot satisfy the default replication factor of 3
      # for Kafka's internal topics.
      KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1
      KAFKA_TRANSACTION_STATE_LOG_REPLICATION_FACTOR: 1
      KAFKA_TRANSACTION_STATE_LOG_MIN_ISR: 1
      CLUSTER_ID: MkU3OEVhNTcwNTJENDM2Qk   # must be a valid base64-encoded UUID
  grafana:
    image: grafana/grafana:latest
    ports:
      - '3000:3000'
```
Quick Start
```shell
docker compose up -d
psql -h localhost -p 4566 -d dev -U root
```
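If `psql` can't connect, it helps to know which of the three services is actually listening. A minimal, stdlib-only port probe; the ports are the host mappings assumed above, so adjust them if yours differ:

```python
import socket

# Host-mapped ports from the Compose file above (adjust to your setup).
SERVICES = {
    "risingwave": 4566,
    "kafka": 29092,   # host listener; use 9092 if you kept a single-listener config
    "grafana": 3000,
}

def port_open(host: str, port: int, timeout: float = 1.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
        sock.settimeout(timeout)
        return sock.connect_ex((host, port)) == 0

if __name__ == "__main__":
    for name, port in SERVICES.items():
        status = "up" if port_open("localhost", port) else "DOWN"
        print(f"{name:>10} :{port} {status}")
```

Run it a few seconds after `docker compose up -d`; all three lines should read `up` before you move on.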
```sql
-- Create a Kafka source; replace ... with your column definitions
-- (the view below assumes at least an event_type column).
CREATE SOURCE events (...) WITH (
    connector = 'kafka',
    topic = 'test',
    properties.bootstrap.server = 'kafka:9092'
) FORMAT PLAIN ENCODE JSON;

-- Create a materialized view that updates incrementally as events arrive
CREATE MATERIALIZED VIEW summary AS
SELECT event_type, COUNT(*) AS event_count
FROM events
GROUP BY event_type;
```
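The source reads JSON from the `test` topic, so you need something producing to it before the view shows rows. A sketch using the `kafka-python` package (`pip install kafka-python`); the event schema (`event_type`, `user_id`, `ts`) is an assumption, so match it to the columns you declared in `CREATE SOURCE`, and pick whichever bootstrap server is reachable from where you run it (`localhost:29092` from the host with the dual-listener Compose config, `kafka:9092` from inside the Compose network):

```python
import json
import time
import uuid

def make_event(event_type: str) -> dict:
    # Hypothetical schema -- align it with the columns in CREATE SOURCE.
    return {
        "event_type": event_type,
        "user_id": str(uuid.uuid4()),
        "ts": int(time.time() * 1000),
    }

def produce_demo_events(bootstrap: str = "localhost:29092", topic: str = "test") -> None:
    """Send a handful of JSON events; requires `pip install kafka-python`."""
    from kafka import KafkaProducer

    producer = KafkaProducer(
        bootstrap_servers=bootstrap,
        value_serializer=lambda v: json.dumps(v).encode("utf-8"),
    )
    for event_type in ("click", "view", "click", "purchase"):
        producer.send(topic, make_event(event_type))
    producer.flush()

# produce_demo_events()  # uncomment once the stack is up
```

After producing, `SELECT * FROM summary;` in psql should show per-type counts, and the counts keep updating as more events arrive.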
Open Grafana at http://localhost:3000 (default credentials admin / admin) and add a PostgreSQL data source. Because Grafana runs inside the Compose network, point it at risingwave:4566 rather than localhost, with database dev, user root, and TLS/SSL disabled.
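Instead of clicking through the UI, you can also provision the data source from a file. A sketch following Grafana's data source provisioning format (field names worth double-checking against the Grafana version you run):

```yaml
# datasource.yml -- mount into the grafana container at
# /etc/grafana/provisioning/datasources/datasource.yml
apiVersion: 1
datasources:
  - name: RisingWave
    type: postgres
    url: risingwave:4566
    user: root
    database: dev
    jsonData:
      sslmode: disable
```

You would also add a `volumes:` entry under the grafana service to mount it, e.g. `./datasource.yml:/etc/grafana/provisioning/datasources/datasource.yml`.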
Frequently Asked Questions
How long does this take to set up?
Under 5 minutes. Docker pulls images (~2 min), services start (~30 sec), then you're ready to create sources and views.
Can I use this for production?
This setup is for development and testing. For production, deploy RisingWave on Kubernetes with proper resource allocation and S3-compatible object storage for state.