Building Real-Time APIs with Streaming Materialized Views

Real-time APIs serve always-current data to applications. Instead of querying a batch-refreshed cache such as Redis, or hitting your OLTP database with expensive aggregation queries, the API queries a streaming materialized view over the PostgreSQL protocol: pre-computed, always fresh, and served with sub-20 ms latency.

Architecture

Event Sources → RisingWave → Materialized Views → API Backend → Frontend
                                    ↑
                          PostgreSQL protocol
                          (any PG driver)
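The materialized views queried by the API below have to be created first. Here is a minimal sketch of that step, assuming a hypothetical `events` source with `user_id` and `event_time` columns (the view name matches `dashboard_metrics` used later; everything else is illustrative):

```python
# Sketch: create the dashboard_metrics materialized view in RisingWave.
# The `events` source and its columns are assumed for illustration.
DASHBOARD_MV_DDL = """
CREATE MATERIALIZED VIEW IF NOT EXISTS dashboard_metrics AS
SELECT
    window_start,
    COUNT(*) AS event_count,
    COUNT(DISTINCT user_id) AS active_users
FROM TUMBLE(events, event_time, INTERVAL '1 minute')
GROUP BY window_start;
"""

def create_dashboard_view(host: str = "risingwave", port: int = 4566) -> None:
    """Run the DDL once; RisingWave speaks the PostgreSQL wire protocol,
    so any PG driver works."""
    import psycopg2  # third-party driver, imported lazily

    with psycopg2.connect(host=host, port=port,
                          dbname="dev", user="root") as conn:
        with conn.cursor() as cur:
            cur.execute(DASHBOARD_MV_DDL)
```

Once created, RisingWave keeps the view incrementally updated as events arrive; the API only ever reads it.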

API Implementation

# FastAPI + RisingWave (queried over the PostgreSQL wire protocol)
from fastapi import FastAPI
import psycopg2.pool  # the pool submodule must be imported explicitly

app = FastAPI()
# RisingWave's SQL frontend listens on port 4566
conn_pool = psycopg2.pool.ThreadedConnectionPool(1, 10,
    host='risingwave', port=4566, dbname='dev', user='root')

# psycopg2 is blocking, so these endpoints are plain `def`;
# FastAPI runs them in its threadpool.
@app.get('/api/dashboard')
def get_dashboard():
    conn = conn_pool.getconn()
    try:
        with conn.cursor() as cursor:
            cursor.execute('SELECT * FROM dashboard_metrics')
            columns = [d[0] for d in cursor.description]
            return [dict(zip(columns, row)) for row in cursor.fetchall()]
    finally:
        conn_pool.putconn(conn)

@app.get('/api/user/{user_id}/activity')
def get_user_activity(user_id: int):
    conn = conn_pool.getconn()
    try:
        with conn.cursor() as cursor:
            cursor.execute('SELECT * FROM user_activity WHERE user_id = %s',
                           (user_id,))
            row = cursor.fetchone()
            if row is None:
                return {}
            columns = [d[0] for d in cursor.description]
            return dict(zip(columns, row))
    finally:
        conn_pool.putconn(conn)
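Both endpoints repeat the same zip-with-description pattern to turn DB-API rows into JSON-friendly dicts. It can be factored into a small helper (a sketch; the name `rows_to_dicts` is ours, not part of any library):

```python
def rows_to_dicts(description, rows):
    """Convert a DB-API cursor's description plus fetched rows into
    a list of column-name -> value dicts, ready for JSON responses."""
    columns = [col[0] for col in description]
    return [dict(zip(columns, row)) for row in rows]
```

Usage inside an endpoint: `rows_to_dicts(cursor.description, cursor.fetchall())`.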

Why Not Redis?

Aspect              Redis Cache           RisingWave MV
Data freshness      Manual invalidation   Automatic (streaming)
Aggregations        Application computes  SQL (pre-computed)
Cache invalidation  Error-prone           Not needed
Query language      Redis commands        Standard SQL
Latency             Sub-ms                10-20 ms

Frequently Asked Questions

Is 10-20ms too slow for an API?

No. Most web APIs have 50-200ms latency including network. 10-20ms query latency from RisingWave is well within API SLA requirements and eliminates cache invalidation complexity.

Can RisingWave handle API-level concurrency?

Yes. RisingWave serves queries via PostgreSQL protocol with connection pooling. For very high concurrency (10K+ QPS), add a connection pooler like PgBouncer.
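A PgBouncer deployment in front of RisingWave needs only a few lines of configuration. A minimal sketch (hostnames, pool sizes, and `auth_type = trust` are illustrative and dev-only; use real authentication in production):

```ini
; pgbouncer.ini (sketch)
[databases]
dev = host=risingwave port=4566 dbname=dev

[pgbouncer]
listen_addr = 0.0.0.0
listen_port = 6432
auth_type = trust          ; dev-only; configure auth for production
pool_mode = transaction    ; multiplex many clients over few backends
default_pool_size = 20
max_client_conn = 10000
```

The API's connection pool then points at PgBouncer on port 6432 instead of RisingWave directly.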
