Real-Time Data for Claude, ChatGPT, and Copilot: A Streaming Architecture
AI assistants like Claude, ChatGPT, and GitHub Copilot are most useful when they have access to current data — not static training snapshots. A streaming database like RisingWave provides real-time context to these assistants through the Model Context Protocol (MCP), function calling, or direct PostgreSQL queries. This guide shows how to connect your business data to AI assistants in real time.
| AI Assistant | MCP Support | Function Calling | PostgreSQL Query |
|---|---|---|---|
| Claude | ✅ (native) | ✅ | Via MCP/tools |
| ChatGPT | ✅ (2025+) | ✅ | Via plugins/tools |
| Copilot | ✅ (VS Code) | ✅ | Via MCP |
| Gemini | ✅ (2025+) | ✅ | Via tools |
Architecture
Business Data (CDC) → RisingWave → Materialized Views → MCP Server → AI Assistants
RisingWave ingests changes from your databases via CDC, maintains pre-computed context as materialized views, and exposes them to AI assistants via MCP or PostgreSQL protocol.
Example: Customer Support Context
CREATE MATERIALIZED VIEW support_context AS
SELECT c.id, c.name, c.plan, c.plan_updated_at,
       COALESCE(t.open_tickets, 0) AS open_tickets,
       COALESCE(o.spend_30d, 0) AS spend_30d
FROM customers c
-- Pre-aggregate each child table: joining customers directly to both
-- tickets and orders would multiply rows and inflate the totals.
LEFT JOIN (
    SELECT customer_id, COUNT(*) AS open_tickets
    FROM tickets WHERE status = 'open' GROUP BY customer_id
) t ON c.id = t.customer_id
LEFT JOIN (
    SELECT customer_id, SUM(amount) AS spend_30d
    FROM orders WHERE created_at > NOW() - INTERVAL '30 days'
    GROUP BY customer_id
) o ON c.id = o.customer_id;
When Claude or ChatGPT needs customer context, it queries this view and gets a result that is always current, with sub-second latency.
Frequently Asked Questions
How do I connect ChatGPT to my streaming data?
Use function calling with a tool that runs read-only PostgreSQL queries against RisingWave. ChatGPT calls the tool, the tool queries a streaming materialized view, and the current result is returned to the model.
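A minimal sketch of that setup, assuming the OpenAI Chat Completions `tools` format. The tool name `query_risingwave` is an illustrative choice, and `run_query` is a stand-in; a real implementation would execute the SQL against RisingWave over a PostgreSQL driver.

```python
# Tool definition the model sees (OpenAI "tools" schema), plus a
# dispatcher that routes the model's tool call back to the database.
QUERY_TOOL = {
    "type": "function",
    "function": {
        "name": "query_risingwave",
        "description": "Run a read-only SQL query against streaming "
                       "materialized views and return the rows.",
        "parameters": {
            "type": "object",
            "properties": {"sql": {"type": "string"}},
            "required": ["sql"],
        },
    },
}

def run_query(sql: str) -> list[dict]:
    # Stand-in: a real version would connect to RisingWave and fetch rows.
    raise NotImplementedError

def handle_tool_call(name: str, arguments: dict, query=run_query):
    """Dispatch a tool call emitted by the model to the query backend."""
    if name != "query_risingwave":
        raise ValueError(f"unknown tool: {name}")
    return query(arguments["sql"])
```

Injecting `query` as a parameter makes the dispatcher testable without a live database.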
What is MCP and which AI tools support it?
Model Context Protocol (97M+ monthly SDK downloads) standardizes AI-to-data connections. Supported by Claude, ChatGPT, Copilot, and Gemini. RisingWave has a built-in MCP server.
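If you prefer a custom server over the built-in one, a tool can also be exposed through the official `mcp` Python SDK. A minimal sketch, with the database call stubbed out and the server name `risingwave-context` chosen for illustration:

```python
# A plain function first, so it stays usable and testable even when the
# MCP SDK is not installed; registration happens only if the import works.
def get_support_context(customer_id: int) -> str:
    """Return live support context for one customer (stubbed here)."""
    # A real implementation would query the support_context view.
    return f"context for customer {customer_id}"

try:
    from mcp.server.fastmcp import FastMCP  # official MCP Python SDK
    server = FastMCP("risingwave-context")
    server.tool()(get_support_context)  # expose the function as an MCP tool
    # To serve Claude or Copilot over stdio, call: server.run()
except ImportError:
    server = None  # SDK absent; the function still works directly
```

Registering the function after defining it (rather than via a decorator) keeps the sketch importable in environments without the SDK.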

