Real-time user acquisition analytics means you measure install quality, early revenue, and return on ad spend by channel and creative as events happen — not two days after campaign spend is sunk. RisingWave ingests attribution events and gameplay behavior from Kafka, joins them with campaign reference data, and maintains continuously updated ROAS and LTV materialized views so your UA team can pause bad campaigns and scale winners within hours.
Why UA Analytics Needs to Be Faster
Mobile and PC game user acquisition operates on compressed feedback cycles. A campaign manager launching a $50,000 creative test on Meta or Google needs to know if the installs are converting to paying players before the daily budget is exhausted — not in next week's cohort report.
Traditional UA analytics pipelines aggregate MMP (mobile measurement partner) postbacks nightly, merge them with payment events in a warehouse, and surface D1/D7 ROAS the following day. By then, low-quality traffic has already consumed budget and inflated install counts with players who churned in the first session.
Streaming analytics changes this by processing attribution and revenue events as they arrive, giving UA managers an early warning signal within the first few hours of a campaign.
Connecting Attribution and Revenue Streams
Ingest attribution postbacks forwarded from your MMP (AppsFlyer, Adjust, Kochava):
```sql
CREATE SOURCE attribution_events (
    install_id VARCHAR,
    player_id BIGINT,
    campaign_id VARCHAR,
    ad_network VARCHAR,
    creative_id VARCHAR,
    channel VARCHAR,
    country VARCHAR,
    platform VARCHAR,
    install_cost_usd NUMERIC,
    install_time TIMESTAMPTZ
) WITH (
    connector = 'kafka',
    topic = 'attribution-postbacks',
    properties.bootstrap.server = 'kafka:9092',
    scan.startup.mode = 'latest'
) FORMAT PLAIN ENCODE JSON;
```
Ingest revenue events in parallel:
```sql
CREATE SOURCE revenue_events (
    player_id BIGINT,
    revenue_type VARCHAR, -- 'iap', 'ad', 'subscription'
    revenue_usd NUMERIC,
    event_time TIMESTAMPTZ
) WITH (
    connector = 'kafka',
    topic = 'revenue-events',
    properties.bootstrap.server = 'kafka:9092',
    scan.startup.mode = 'latest'
) FORMAT PLAIN ENCODE JSON;
```
Early ROAS Materialized View
Build a materialized view that computes D0 and D1 revenue per campaign, enriched with spend data:
```sql
CREATE MATERIALIZED VIEW campaign_early_roas AS
SELECT
    a.campaign_id,
    a.ad_network,
    a.channel,
    a.country,
    a.platform,
    window_start,
    window_end,
    COUNT(DISTINCT a.player_id) AS installs,
    SUM(a.install_cost_usd) AS total_spend,
    SUM(COALESCE(rev.d1_revenue, 0)) AS total_revenue,
    CASE
        WHEN SUM(a.install_cost_usd) > 0
        THEN SUM(COALESCE(rev.d1_revenue, 0)) / SUM(a.install_cost_usd)
        ELSE 0
    END AS roas,
    CASE
        WHEN COUNT(DISTINCT a.player_id) > 0
        THEN SUM(a.install_cost_usd) / COUNT(DISTINCT a.player_id)
        ELSE 0
    END AS cpi
FROM TUMBLE(attribution_events, install_time, INTERVAL '1 hour') a
LEFT JOIN (
    -- Pre-aggregate first-24-hour revenue to one row per install. Joining
    -- raw revenue events directly would duplicate each attribution row once
    -- per revenue event and double-count install_cost_usd in total_spend.
    SELECT
        a2.install_id,
        SUM(r.revenue_usd) AS d1_revenue
    FROM attribution_events a2
    JOIN revenue_events r
        ON a2.player_id = r.player_id
        AND r.event_time BETWEEN a2.install_time
            AND a2.install_time + INTERVAL '24 hours'
    GROUP BY a2.install_id
) rev ON a.install_id = rev.install_id
GROUP BY a.campaign_id, a.ad_network, a.channel, a.country, a.platform,
    window_start, window_end;
```
This gives UA managers an hourly ROAS signal by campaign and channel within the first day of a new creative launch.
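A dashboard or alerting job can then poll the view for recent underperformers. A minimal example; the spend and ROAS thresholds here are illustrative, not recommendations:

```sql
-- Campaigns from the last six hourly windows with meaningful spend
-- but weak early ROAS, worst offenders first.
SELECT campaign_id, ad_network, channel, installs, total_spend, roas, cpi
FROM campaign_early_roas
WHERE window_start >= NOW() - INTERVAL '6 hours'
  AND total_spend > 500
  AND roas < 0.10
ORDER BY total_spend DESC;
```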
Install Quality Scoring
Not all installs are equal. Build a quality score based on early engagement signals:
```sql
CREATE MATERIALIZED VIEW install_quality_by_campaign AS
SELECT
    a.campaign_id,
    a.creative_id,
    a.ad_network,
    window_start,
    window_end,
    COUNT(DISTINCT a.player_id) AS installs,
    COUNT(DISTINCT CASE WHEN rev.d1_revenue > 0 THEN a.player_id END) AS paying_players,
    AVG(COALESCE(rev.d1_revenue, 0)) AS avg_d1_revenue,
    CASE
        WHEN COUNT(DISTINCT a.player_id) > 0
        THEN COUNT(DISTINCT CASE WHEN rev.d1_revenue > 0 THEN a.player_id END)::NUMERIC
            / COUNT(DISTINCT a.player_id)
        ELSE 0
    END AS d1_conversion_rate,
    CASE
        WHEN AVG(COALESCE(rev.d1_revenue, 0)) >= 1.50 THEN 'high_quality'
        WHEN AVG(COALESCE(rev.d1_revenue, 0)) >= 0.30 THEN 'mid_quality'
        ELSE 'low_quality'
    END AS install_quality_tier
FROM TUMBLE(attribution_events, install_time, INTERVAL '6 hours') a
LEFT JOIN (
    -- One row of first-24-hour revenue per install, so the averages are
    -- per install rather than per revenue event.
    SELECT a2.install_id, SUM(r.revenue_usd) AS d1_revenue
    FROM attribution_events a2
    JOIN revenue_events r
        ON a2.player_id = r.player_id
        AND r.event_time BETWEEN a2.install_time
            AND a2.install_time + INTERVAL '24 hours'
    GROUP BY a2.install_id
) rev ON a.install_id = rev.install_id
GROUP BY a.campaign_id, a.creative_id, a.ad_network, window_start, window_end;
```
When this view feeds an alerting pipeline, campaigns that consistently generate low_quality installs can be paused automatically.
UA Analytics Approach Comparison
| Dimension | MMP Dashboard | Warehouse (next day) | RisingWave Streaming |
|---|---|---|---|
| ROAS freshness | 4–6 hours | 24+ hours | < 1 hour |
| Install quality signal | Basic | D1/D7 cohorts | Same-session early signal |
| Custom attribution logic | No | SQL | SQL |
| Cross-channel unified view | Limited | Yes | Yes |
| Campaign pause automation | Manual | Manual | Automated via Kafka sink |
| LTV modeling input | Delayed | Delayed | Near real-time |
Downstream Automation
Route low-quality campaign signals to Kafka for automated bidding adjustments:
```sql
CREATE SINK low_quality_campaigns_alert
FROM install_quality_by_campaign
WITH (
    connector = 'kafka',
    topic = 'ua-quality-alerts',
    properties.bootstrap.server = 'kafka:9092'
) FORMAT UPSERT ENCODE JSON;
```
A downstream consumer reads from ua-quality-alerts and calls the ad network's API to reduce bids or pause creatives when install_quality_tier = 'low_quality' and installs > 100.
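That filter can also live in RisingWave itself, so only actionable rows ever reach Kafka and the consumer needs no threshold logic of its own. A sketch, reusing the thresholds above (100-install minimum is an assumption, not a recommendation):

```sql
-- Only campaigns that already meet the pause criteria flow downstream;
-- define the sink FROM this view instead of the full quality view.
CREATE MATERIALIZED VIEW low_quality_campaign_alerts AS
SELECT campaign_id, creative_id, ad_network, installs, install_quality_tier
FROM install_quality_by_campaign
WHERE install_quality_tier = 'low_quality'
  AND installs > 100;
```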
FAQ
Q: MMP postbacks can arrive with significant delays. How does RisingWave handle late data?
A: Configure event time watermarks in RisingWave to tolerate late arrivals. Attribution events delayed by minutes or hours are still included in the correct window if they arrive within the watermark tolerance period. For very late data (days), a separate correction pipeline can backfill using historical mode.
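As a sketch, the watermark can be declared on the source itself (columns abbreviated; the 2-hour tolerance is an assumption to tune against your MMP's observed postback delays):

```sql
CREATE SOURCE attribution_events (
    install_id VARCHAR,
    player_id BIGINT,
    campaign_id VARCHAR,
    install_cost_usd NUMERIC,
    install_time TIMESTAMPTZ,
    -- Postbacks up to 2 hours behind the newest install_time seen so far
    -- are still assigned to their correct time window.
    WATERMARK FOR install_time AS install_time - INTERVAL '2 hours'
) WITH (
    connector = 'kafka',
    topic = 'attribution-postbacks',
    properties.bootstrap.server = 'kafka:9092'
) FORMAT PLAIN ENCODE JSON;
```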
Q: Can I model multi-touch attribution in RisingWave?
A: Basic last-touch and first-touch attribution models translate naturally to SQL aggregations. Multi-touch attribution requiring complex path analysis is better handled in a downstream analytical query, but you can pre-aggregate touch sequences in a materialized view as input.
Q: How do I compare ROAS across different attribution windows (D1, D3, D7)?
A: Define separate materialized views or filters using different time offsets in the join condition (INTERVAL '3 days', INTERVAL '7 days'). Each view maintains its own incremental state independently.
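For instance, a D7 variant of the early ROAS view mostly just widens the revenue join window; a sketch reusing the sources defined above, with daily windows to suit the longer horizon:

```sql
CREATE MATERIALIZED VIEW campaign_d7_roas AS
SELECT
    a.campaign_id,
    window_start,
    window_end,
    COUNT(DISTINCT a.player_id) AS installs,
    SUM(a.install_cost_usd) AS total_spend,
    SUM(COALESCE(rev.d7_revenue, 0)) AS d7_revenue
FROM TUMBLE(attribution_events, install_time, INTERVAL '1 day') a
LEFT JOIN (
    -- One revenue row per install keeps spend from being double-counted.
    SELECT a2.install_id, SUM(r.revenue_usd) AS d7_revenue
    FROM attribution_events a2
    JOIN revenue_events r
        ON a2.player_id = r.player_id
        -- Widened from 24 hours to 7 days for the D7 horizon.
        AND r.event_time BETWEEN a2.install_time
            AND a2.install_time + INTERVAL '7 days'
    GROUP BY a2.install_id
) rev ON a.install_id = rev.install_id
GROUP BY a.campaign_id, window_start, window_end;
```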
Q: Our UA spend data is in a separate system. Can we join it in RisingWave?
A: Yes. Ingest spend data from your campaign management system via a Kafka topic or PostgreSQL CDC source, then join it with attribution events using the campaign_id key in a materialized view.
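A sketch of that pattern, assuming the campaign management system publishes spend reports to a hypothetical `campaign-spend` Kafka topic with this shape:

```sql
-- Spend reports from the campaign management system.
CREATE SOURCE campaign_spend (
    campaign_id VARCHAR,
    spend_usd NUMERIC,
    report_time TIMESTAMPTZ
) WITH (
    connector = 'kafka',
    topic = 'campaign-spend',
    properties.bootstrap.server = 'kafka:9092'
) FORMAT PLAIN ENCODE JSON;

-- Roll spend up to one row per campaign before joining, so the join is
-- 1:1 on campaign_id and reported spend is never duplicated.
CREATE MATERIALIZED VIEW campaign_reported_spend AS
SELECT campaign_id, SUM(spend_usd) AS reported_spend
FROM campaign_spend
GROUP BY campaign_id;
```

The aggregate view then joins cleanly against any of the per-campaign views above on campaign_id.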
Q: What is the minimum install volume needed before ROAS estimates are statistically meaningful?
A: This is a statistics question independent of the streaming infrastructure — typically 100–500 installs per creative for directional signal. RisingWave can expose confidence intervals via SQL window functions if you add sample-size gating logic to the view.
Optimize UA Spend with Real-Time Intelligence
The gap between campaign launch and actionable ROAS data is pure budget risk. Streaming analytics with RisingWave compresses that gap from days to hours, letting UA teams make decisions while campaigns are still running.
- Deploy RisingWave and connect your first attribution stream: https://docs.risingwave.com/get-started
- Talk to the community about UA analytics patterns: https://risingwave.com/slack

