Real-time conversion tracking with a streaming database means every purchase, form submission, and app install is reflected in your performance reports within seconds—not hours. RisingWave maintains continuously updated conversion metrics as materialized views, giving media buyers, growth teams, and product managers live visibility into what's working right now.
Why Conversion Tracking Latency Hurts ROI
Conversion data is the ground truth of marketing performance. Ad platforms, bidding algorithms, and attribution models all depend on it. When conversion data is 24 hours stale, every downstream decision is compromised.
The practical consequences are severe:
- Smart bidding algorithms (Google, Meta) need conversion data to optimize bids. Delayed signals mean the algorithm is learning from old data and bidding suboptimally today
- Budget allocation decisions get made in morning standups based on yesterday's numbers
- A/B test decisions lag because results arrive in daily increments, even when enough conversions have already accumulated to call the test; real-time data lets you reach significance as soon as the sample supports it
- Bid adjustments for high-converting placements are delayed while budget flows to underperformers
A streaming database eliminates these delays by treating every conversion event as a trigger for immediate metric updates.
Architecture: Real-Time Conversion Pipeline
The conversion tracking pipeline ingests events from three sources: web conversion pixels (via Kafka), mobile SDK events (via Kafka), and order management system events (via Postgres CDC). Materialized views then compute conversion metrics by campaign, channel, creative, and time window.
Key design choice: conversions are written to Kafka as they happen, not batch-uploaded. This requires a server-side conversion event system (not client-side only), which also improves accuracy by avoiding ad blocker interference.
Ingesting Conversion Events
CREATE SOURCE conversion_events (
conversion_id VARCHAR,
user_id VARCHAR,
session_id VARCHAR,
campaign_id VARCHAR,
ad_id VARCHAR,
channel VARCHAR,
conversion_type VARCHAR,
revenue DECIMAL,
currency VARCHAR,
order_id VARCHAR,
event_time TIMESTAMPTZ
)
WITH (
connector = 'kafka',
topic = 'conversion-events',
properties.bootstrap.server = 'kafka:9092',
scan.startup.mode = 'latest'
)
FORMAT PLAIN ENCODE JSON;
Stream order data from your transaction database to enrich conversions:
CREATE TABLE orders (
  -- Column list must match the upstream Postgres table;
  -- CDC tables require an explicit schema and primary key.
  order_id VARCHAR PRIMARY KEY,
  user_id VARCHAR,
  order_total DECIMAL,
  created_at TIMESTAMPTZ
)
WITH (
  connector = 'postgres-cdc',
  hostname = 'orders-db',
  port = '5432',
  username = 'replicator',
  password = 'secret',
  database.name = 'ecommerce',
  schema.name = 'public',
  table.name = 'orders',
  slot.name = 'rw_conversions_slot'
);
CREATE TABLE campaign_metadata (
campaign_id VARCHAR PRIMARY KEY,
advertiser_id VARCHAR,
channel VARCHAR,
campaign_type VARCHAR,
cost_per_day DECIMAL
);
Real-Time Conversion Metrics by Campaign
CREATE MATERIALIZED VIEW campaign_conversions_live AS
SELECT
c.campaign_id,
cm.channel,
cm.campaign_type,
window_start,
window_end,
COUNT(*) FILTER (WHERE c.conversion_type = 'purchase') AS purchases,
COUNT(*) FILTER (WHERE c.conversion_type = 'signup') AS signups,
COUNT(*) FILTER (WHERE c.conversion_type = 'trial_start') AS trials,
SUM(c.revenue) FILTER (WHERE c.conversion_type = 'purchase') AS revenue,
AVG(c.revenue) FILTER (WHERE c.conversion_type = 'purchase') AS avg_order_value,
COUNT(DISTINCT c.user_id) AS unique_converters,
COUNT(DISTINCT c.order_id) AS unique_orders
FROM TUMBLE(conversion_events, event_time, INTERVAL '1 HOUR') c
LEFT JOIN campaign_metadata cm ON c.campaign_id = cm.campaign_id
GROUP BY c.campaign_id, cm.channel, cm.campaign_type, window_start, window_end;
Conversion Rate and ROAS Calculation
Join conversion data with impression counts to compute live ROAS and conversion rates:
CREATE MATERIALIZED VIEW campaign_roas_live AS
SELECT
cv.campaign_id,
cv.channel,
cv.window_start,
cv.window_end,
cv.purchases,
cv.revenue,
cv.avg_order_value,
cm.cost_per_day / 24.0 AS hourly_spend_target,
-- Return on Ad Spend
ROUND(cv.revenue / NULLIF(cm.cost_per_day / 24.0, 0), 2) AS roas,
-- Cost per acquisition estimate
ROUND((cm.cost_per_day / 24.0) / NULLIF(cv.purchases, 0), 2)
AS estimated_cpa
FROM campaign_conversions_live cv
LEFT JOIN campaign_metadata cm ON cv.campaign_id = cm.campaign_id;
Deduplication for Accurate Counting
Production conversion pipelines must deduplicate events (pixel fires twice, SDK retry). Create a deduplication layer before computing metrics:
CREATE MATERIALIZED VIEW deduped_conversions AS
SELECT DISTINCT ON (conversion_id)
conversion_id,
user_id,
campaign_id,
ad_id,
channel,
conversion_type,
revenue,
order_id,
event_time
FROM conversion_events
ORDER BY conversion_id, event_time ASC;
Comparison: Conversion Tracking Approaches
| Dimension | Client-Side Pixels (Batch) | Server-Side Batch | Streaming Database (RisingWave) |
|---|---|---|---|
| Reporting latency | Hours | Hours | Seconds |
| Deduplication | Platform-side only | Possible in ETL | Native (materialized view) |
| Ad blocker impact | High data loss | None (server-side) | None (server-side) |
| Attribution freshness | Delayed | Delayed | Real-time |
| ROAS visibility | End-of-day | End-of-day | Real-time |
| Bidding algorithm signals | Delayed | Delayed | Immediate |
Sinking Conversion Data to Ad Platforms and BI
Push real-time conversion signals back to your ad platforms via Kafka:
CREATE SINK conversion_signals_kafka
FROM deduped_conversions
WITH (
connector = 'kafka',
topic = 'ad-platform-conversion-signals',
properties.bootstrap.server = 'kafka:9092'
)
FORMAT PLAIN ENCODE JSON;
Write ROAS metrics to your BI database for dashboards:
CREATE SINK roas_metrics_to_bi
FROM campaign_roas_live
WITH (
connector = 'jdbc',
jdbc.url = 'jdbc:postgresql://bi-db:5432/marketing',
table.name = 'campaign_roas_live',
type = 'upsert',
primary_key = 'campaign_id,window_start'
);
FAQ
Q: How does RisingWave handle duplicate conversion events?
The DISTINCT ON pattern in the deduplication materialized view keeps only the first event per conversion_id. This handles pixel double-fires, SDK retries, and webhook re-deliveries automatically. The downstream metrics views then query the deduplicated view rather than the raw source.
Q: Can I track micro-conversions (video views, scroll depth) alongside macro-conversions?
Yes. Add micro-conversion event types to your conversion_events source schema. Use FILTER WHERE in your aggregation views to count macro and micro conversions separately. This lets you build leading-indicator metrics (micro-conversion rate) that predict macro-conversion performance.
Q: How does real-time conversion data improve smart bidding?
Ad platforms like Google and Meta accept server-side conversion APIs that send conversion events in near-real-time. Feed deduped_conversions from RisingWave into the Conversions API directly. The bidding algorithm trains on fresher signals, improving bid optimization within the same campaign flight.
Q: What's the difference between this and a standard event tracking database?
Standard event databases (ClickHouse, BigQuery) store and query events but require you to run queries to get current metrics. RisingWave's materialized views are always pre-computed—metrics are current without running a query, because the query ran incrementally as each event arrived.
Q: How do I handle currency conversion for multi-market campaigns?
Store exchange rates in a RisingWave table and join conversion events to it with a temporal join (FOR SYSTEM_TIME AS OF PROCTIME()), so each conversion picks up the rate in effect when it is processed—effectively the rate at conversion time in a real-time pipeline. Update the exchange rate table via CDC from your FX data provider.
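A sketch of that pattern, assuming a hypothetical fx_rates table keyed by currency (table and column names are illustrative):

```sql
-- Rates table, updated continuously (e.g. via CDC from an FX provider).
CREATE TABLE fx_rates (
  currency VARCHAR PRIMARY KEY,
  usd_rate DECIMAL
);

-- Temporal join: each conversion is priced with the rate
-- current at the moment the event is processed.
CREATE MATERIALIZED VIEW conversions_usd AS
SELECT
  c.conversion_id,
  c.campaign_id,
  c.revenue * fx.usd_rate AS revenue_usd,
  c.event_time
FROM conversion_events c
JOIN fx_rates FOR SYSTEM_TIME AS OF PROCTIME() fx
  ON c.currency = fx.currency;
```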
Get Started
Deploy real-time conversion tracking that feeds live ROAS data to your entire marketing stack.
- Documentation: docs.risingwave.com/get-started
- Community: risingwave.com/slack

