RisingWave optimizes ad spend by maintaining continuously updated ROAS, CPA, and budget pacing metrics across all channels in real time. Instead of waiting for daily reports to reallocate budgets, marketing teams can automate spend decisions based on streaming SQL signals that reflect current performance — reducing wasted spend and improving return on ad investment by acting on opportunities within minutes.
The Cost of Latency in Ad Spend Decisions
Every dollar of ad spend is a bet. You are betting that a given channel, campaign, and creative will deliver returns greater than its cost. The quality of that bet depends entirely on how fresh your performance data is.
With hourly or daily batch analytics, you make budget decisions based on yesterday's performance. A campaign that started underperforming at 9 AM gets paused at 11 AM — after two hours of wasted spend. A high-performing channel that hit its daily budget limit at noon receives no additional allocation, even though it has a 400% ROAS and the underperforming channel still has budget.
Streaming SQL eliminates this lag. RisingWave maintains ROAS, pacing ratios, and budget utilization as live materialized views. Budget optimization logic running against these views acts on current performance, not historical averages.
Setting Up the Spend Analytics Pipeline
Ingest ad events and spend records from Kafka and your ad server database:
CREATE SOURCE ad_spend_events (
event_id VARCHAR,
campaign_id VARCHAR,
channel VARCHAR,
ad_group_id VARCHAR,
spend_usd DOUBLE PRECISION,
impressions BIGINT,
clicks BIGINT,
event_time TIMESTAMPTZ
)
WITH (
connector = 'kafka',
topic = 'ads.spend.events',
properties.bootstrap.server = 'kafka:9092',
scan.startup.mode = 'latest'
)
FORMAT PLAIN ENCODE JSON;
CREATE SOURCE revenue_events (
order_id VARCHAR,
campaign_id VARCHAR,
channel VARCHAR,
revenue_usd DOUBLE PRECISION,
event_time TIMESTAMPTZ
)
WITH (
connector = 'kafka',
topic = 'ads.revenue.events',
properties.bootstrap.server = 'kafka:9092',
scan.startup.mode = 'latest'
)
FORMAT PLAIN ENCODE JSON;
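Both topics carry plain JSON, one object per event. For reference, here is what producer payloads matching the two source schemas might look like — the field names come from the definitions above, but all values (IDs, amounts, timestamps) are made up for illustration:

```python
import json
from datetime import datetime, timezone

# Illustrative ad-spend event matching the ad_spend_events schema
spend_event = {
    "event_id": "evt-001",
    "campaign_id": "cmp-42",
    "channel": "meta",
    "ad_group_id": "adg-7",
    "spend_usd": 12.50,
    "impressions": 4100,
    "clicks": 63,
    "event_time": datetime(2024, 5, 1, 9, 30, tzinfo=timezone.utc).isoformat(),
}

# Illustrative revenue event matching the revenue_events schema
revenue_event = {
    "order_id": "ord-9001",
    "campaign_id": "cmp-42",
    "channel": "meta",
    "revenue_usd": 89.99,
    "event_time": datetime(2024, 5, 1, 10, 15, tzinfo=timezone.utc).isoformat(),
}

# These strings are what a producer would publish to the Kafka topics
spend_payload = json.dumps(spend_event)
revenue_payload = json.dumps(revenue_event)
```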
-- Campaign budget targets from your ad ops database.
-- Note: postgres-cdc tables are ingested with CREATE TABLE and require a
-- primary key; the columns listed here are the ones this pipeline uses.
CREATE TABLE campaign_budgets (
campaign_id VARCHAR PRIMARY KEY,
daily_budget_usd DOUBLE PRECISION
)
WITH (
connector = 'postgres-cdc',
hostname = 'adops-postgres',
port = '5432',
username = 'rw_reader',
password = 'secret',
database.name = 'adops',
schema.name = 'public',
table.name = 'campaign_budgets'
);
Real-Time ROAS and Efficiency Monitoring
The core optimization view joins spend with revenue in rolling hourly windows:
CREATE MATERIALIZED VIEW channel_roas_realtime AS
WITH hourly_spend AS (
-- Pre-aggregate spend per hour so the attribution join below
-- cannot fan out and double-count spend rows
SELECT
channel,
campaign_id,
window_start,
window_end,
SUM(spend_usd) AS spend_usd,
SUM(impressions) AS impressions,
SUM(clicks) AS clicks
FROM TUMBLE(ad_spend_events, event_time, INTERVAL '1 hour')
GROUP BY channel, campaign_id, window_start, window_end
)
SELECT
s.channel,
s.campaign_id,
s.window_start,
s.window_end,
s.spend_usd,
s.impressions,
s.clicks,
COUNT(r.order_id) AS conversions,
SUM(r.revenue_usd) AS revenue_usd,
-- Core efficiency metrics
SUM(r.revenue_usd) / NULLIF(s.spend_usd, 0) AS roas,
s.spend_usd / NULLIF(COUNT(r.order_id), 0) AS cpa_usd,
s.clicks::DOUBLE PRECISION / NULLIF(s.impressions, 0) AS ctr,
-- Profit margin on ad spend
(SUM(r.revenue_usd) - s.spend_usd) / NULLIF(s.spend_usd, 0) AS profit_on_spend
FROM hourly_spend s
-- 7-day attribution window: credit revenue to the spend hour it followed
LEFT JOIN revenue_events r
ON s.campaign_id = r.campaign_id
AND r.event_time >= s.window_start
AND r.event_time < s.window_end + INTERVAL '7 days'
GROUP BY s.channel, s.campaign_id, s.window_start, s.window_end,
s.spend_usd, s.impressions, s.clicks;
This view answers the question "what is my ROAS right now?" — not "what was my ROAS yesterday?" The difference in decision quality is significant.
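The metric definitions in the view are simple ratios with divide-by-zero guards (NULLIF). The same arithmetic can be sketched in Python with made-up hourly numbers:

```python
def safe_div(num, den):
    """Mirror SQL's x / NULLIF(y, 0): return None instead of dividing by zero."""
    return None if den == 0 else num / den

def efficiency_metrics(spend_usd, impressions, clicks, conversions, revenue_usd):
    """Compute the same four metrics as the channel_roas_realtime view."""
    return {
        "roas": safe_div(revenue_usd, spend_usd),
        "cpa_usd": safe_div(spend_usd, conversions),
        "ctr": safe_div(clicks, impressions),
        "profit_on_spend": safe_div(revenue_usd - spend_usd, spend_usd),
    }

# One hour of a campaign: $500 spend driving $2,000 attributed revenue
m = efficiency_metrics(spend_usd=500.0, impressions=100_000,
                       clicks=2_000, conversions=40, revenue_usd=2_000.0)
# roas = 4.0, cpa_usd = 12.5, ctr = 0.02, profit_on_spend = 3.0
```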
Budget Pacing and Optimization Signals
Build a pacing view that compares actual spend against ideal linear pacing:
CREATE MATERIALIZED VIEW budget_pacing_signals AS
WITH daily_spend AS (
SELECT
s.campaign_id,
s.channel,
window_start,
window_end,
SUM(s.spend_usd) AS spend_today,
b.daily_budget_usd,
-- Ideal spend at this point in the day (linear pacing).
-- Streaming MVs cannot call CURRENT_TIMESTAMP in the select list,
-- so use the latest observed event as the pacing clock.
b.daily_budget_usd * (
EXTRACT(HOUR FROM MAX(s.event_time)) + 1
) / 24.0 AS ideal_spend_now
FROM TUMBLE(ad_spend_events, event_time, INTERVAL '1 day') s
JOIN campaign_budgets b ON s.campaign_id = b.campaign_id
GROUP BY s.campaign_id, s.channel, window_start, window_end, b.daily_budget_usd
)
SELECT
campaign_id,
channel,
window_start,
window_end,
spend_today,
daily_budget_usd,
daily_budget_usd - spend_today AS remaining_budget,
ideal_spend_now,
-- Pacing ratio: >1.0 means overpacing, <1.0 means underpacing
spend_today / NULLIF(ideal_spend_now, 0) AS pacing_ratio,
-- Action signal
CASE
WHEN spend_today >= daily_budget_usd * 0.95 THEN 'budget_exhausted'
WHEN spend_today / NULLIF(ideal_spend_now, 0) > 1.3 THEN 'overpacing_pause'
WHEN spend_today / NULLIF(ideal_spend_now, 0) < 0.7 THEN 'underpacing_boost'
ELSE 'on_track'
END AS pacing_action
FROM daily_spend;
The pacing_action column drives automated budget adjustments. A downstream service reads this view and calls ad platform APIs to adjust bids or pause campaigns — acting on current pacing data, not yesterday's summary.
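For reference, the CASE logic behind pacing_action reduces to a small decision function. A sketch with the same thresholds (95% budget cap, 1.3/0.7 pacing bounds) and illustrative numbers:

```python
def pacing_action(spend_today, daily_budget, hour_of_day):
    """Replicate the view's CASE expression: linear pacing against a daily budget."""
    if spend_today >= daily_budget * 0.95:
        return "budget_exhausted"
    # Ideal linear spend through the end of the current hour
    ideal_spend_now = daily_budget * (hour_of_day + 1) / 24.0
    pacing_ratio = spend_today / ideal_spend_now if ideal_spend_now else 0.0
    if pacing_ratio > 1.3:
        return "overpacing_pause"
    if pacing_ratio < 0.7:
        return "underpacing_boost"
    return "on_track"

# $600 spent by 9 AM on a $1,000 daily budget:
# ideal spend is 1000 * 10/24 ≈ $416.67, ratio ≈ 1.44 -> pause
print(pacing_action(600.0, 1000.0, 9))   # overpacing_pause
print(pacing_action(300.0, 1000.0, 12))  # underpacing_boost
print(pacing_action(500.0, 1000.0, 11))  # on_track
```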
Comparison: Ad Spend Optimization Approaches
| Approach | Optimization Latency | Budget Waste Window | Automation Capability | Setup Cost |
| --- | --- | --- | --- | --- |
| Manual daily review | 24 hours | Full day | None | None |
| Ad platform auto-bidding | Minutes (black box) | Hours | Platform-only | Low |
| Hourly reporting + rules | 1–2 hours | 1–2 hours | Partial | Medium |
| RisingWave streaming SQL | Seconds | Minutes | Full (custom logic) | Low |
Channel Budget Reallocation Signals
Identify channels that are underperforming vs. the portfolio and flag them for budget reallocation:
CREATE MATERIALIZED VIEW spend_reallocation_signals AS
WITH perf AS (
SELECT
channel,
campaign_id,
window_start,
spend_usd,
revenue_usd,
roas,
-- Portfolio ROAS across all campaigns in the same window
AVG(roas) OVER (PARTITION BY window_start) AS portfolio_roas
FROM channel_roas_realtime
)
SELECT
channel,
campaign_id,
window_start,
spend_usd,
revenue_usd,
roas,
-- Compare to portfolio ROAS in same window
roas / NULLIF(portfolio_roas, 0) AS roas_vs_portfolio,
CASE
WHEN roas / NULLIF(portfolio_roas, 0) > 1.25 THEN 'increase_budget'
WHEN roas / NULLIF(portfolio_roas, 0) < 0.75 THEN 'decrease_budget'
ELSE 'maintain'
END AS reallocation_signal
FROM perf;
Automation consuming this view reallocates budget from decrease_budget campaigns to increase_budget campaigns — continuously, throughout the day, based on live performance.
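One simple policy such an automation service might implement (this is illustrative, not part of RisingWave): trim a fixed percentage from each decrease_budget campaign and split the freed amount evenly across increase_budget campaigns:

```python
def reallocate(budgets, signals, trim_pct=0.10):
    """budgets: {campaign_id: daily_budget}; signals: {campaign_id: signal}.
    Move trim_pct of each 'decrease_budget' campaign's budget, split evenly
    across 'increase_budget' campaigns. Returns the adjusted budgets."""
    new = dict(budgets)
    losers = [c for c, s in signals.items() if s == "decrease_budget"]
    winners = [c for c, s in signals.items() if s == "increase_budget"]
    if not losers or not winners:
        return new  # nothing to move, or nowhere to move it
    freed = 0.0
    for c in losers:
        cut = new[c] * trim_pct
        new[c] -= cut
        freed += cut
    for c in winners:
        new[c] += freed / len(winners)
    return new

budgets = {"cmp-a": 1000.0, "cmp-b": 500.0, "cmp-c": 800.0}
signals = {"cmp-a": "increase_budget", "cmp-b": "decrease_budget", "cmp-c": "maintain"}
# 10% of cmp-b's $500 (= $50) moves to cmp-a
print(reallocate(budgets, signals))  # {'cmp-a': 1050.0, 'cmp-b': 450.0, 'cmp-c': 800.0}
```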
Pushing Optimization Actions Downstream
Sink pacing and reallocation signals to Kafka for consumption by automation services:
CREATE SINK spend_optimization_actions
FROM budget_pacing_signals
WITH (
connector = 'kafka',
properties.bootstrap.server = 'kafka:9092',
topic = 'ads.optimization.actions',
-- UPSERT sinks need a primary key for the Kafka message key
primary_key = 'campaign_id,channel,window_start'
)
FORMAT UPSERT ENCODE JSON;
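A consumer of this topic sees upsert-style messages: the Kafka key carries the row's primary key as JSON, the value carries the full row, and a null value acts as a tombstone (the row was retracted). A sketch of the decode step, independent of any particular Kafka client library (the sample key/value bytes are illustrative):

```python
import json

def decode_upsert(key_bytes, value_bytes):
    """Decode one message from the upsert sink's topic.
    Returns (key_dict, row_dict) for an upsert, or (key_dict, None) for a tombstone."""
    key = json.loads(key_bytes)
    row = json.loads(value_bytes) if value_bytes is not None else None
    return key, row

key = b'{"campaign_id": "cmp-42", "channel": "meta", "window_start": "2024-05-01 00:00:00"}'
val = b'{"campaign_id": "cmp-42", "channel": "meta", "pacing_action": "overpacing_pause"}'

k, row = decode_upsert(key, val)   # row["pacing_action"] is "overpacing_pause"
k, row = decode_upsert(key, None)  # tombstone: row is None
```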
FAQ
Q: How does RisingWave complement ad platform auto-bidding algorithms?
A: Ad platform auto-bidding optimizes within a single platform. RisingWave gives you cross-platform visibility — enabling portfolio-level budget reallocation between Google, Meta, programmatic, and other channels. The two approaches work together: let platform auto-bidding handle intra-campaign optimization while RisingWave handles cross-channel portfolio management.
Q: Can I use RisingWave to enforce brand safety budget guardrails?
A: Yes. Build a materialized view that tracks spend against a safety threshold. When cumulative spend on a flagged publisher or category approaches the limit, emit a pause signal to your ad server. This creates a real-time safety circuit breaker that acts in seconds rather than waiting for a daily audit.
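The guardrail itself is a threshold check on cumulative spend. A sketch of what a consumer of such a view might do — the warn threshold and signal names here are illustrative:

```python
def guardrail_signal(cumulative_spend, limit, warn_pct=0.8):
    """Circuit-breaker signal for a flagged publisher/category spend limit."""
    if cumulative_spend >= limit:
        return "pause"   # emit a pause signal to the ad server
    if cumulative_spend >= limit * warn_pct:
        return "warn"    # approaching the limit
    return "ok"

print(guardrail_signal(4200.0, 5000.0))  # warn (84% of limit)
print(guardrail_signal(5100.0, 5000.0))  # pause
```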
Q: How accurate is the ROAS attribution in a streaming pipeline vs. a warehouse?
A: Attribution accuracy depends on your attribution window and join logic, not the processing architecture. RisingWave applies the same attribution logic as a warehouse — the difference is that results update continuously. For multi-touch attribution with long windows, a hybrid approach (streaming for operational signals, warehouse for formal reporting) is common.
Q: What ad platforms can RisingWave connect to directly?
A: RisingWave connects via Kafka, PostgreSQL CDC, and MySQL CDC. Most ad platforms expose reporting data via API, not event streams — use a connector like Airbyte to pull spend data into Kafka or your operational database, then source it into RisingWave.
Q: How do I handle currency conversion for multi-currency campaigns?
A: Create an fx_rates reference table with currency pairs and rates. Join spend events against this table with a temporal join (FOR SYSTEM_TIME AS OF PROCTIME()), so each event is converted with the rate in effect when it is processed, and normalize all spend to a base currency before aggregation.
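The lookup the join performs amounts to "latest rate at or before this point in time". An illustrative Python version over a small in-memory rate table (the pairs and rates are made up):

```python
from datetime import datetime

# (valid_from, rate) pairs for EUR -> USD, sorted by valid_from
eur_usd_rates = [
    (datetime(2024, 5, 1, 0, 0), 1.07),
    (datetime(2024, 5, 1, 12, 0), 1.08),
]

def to_usd(amount, currency, at_time, rates):
    """Convert spend to the base currency using the rate valid at at_time."""
    if currency == "USD":
        return amount
    valid = [r for t, r in rates if t <= at_time]
    if not valid:
        raise ValueError("no FX rate valid at this time")
    return amount * valid[-1]  # latest rate at or before at_time

# 100 EUR spent at 9 AM converts at the 1.07 rate
print(to_usd(100.0, "EUR", datetime(2024, 5, 1, 9, 0), eur_usd_rates))
```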
Get Started
Build your real-time ad spend optimization pipeline with the RisingWave quickstart guide.
Join marketing engineers optimizing ad budgets in the RisingWave Slack community.

