Mobile Game Analytics Pipeline with Streaming SQL

A mobile game analytics pipeline built on streaming SQL uses RisingWave to continuously process player install events, session data, in-app purchases, and ad engagement as they happen. Instead of waiting for nightly ETL jobs, product managers and growth teams see live DAU, ARPU, and retention metrics that reflect the last few seconds of activity.

Why Mobile Game Analytics Needs to Be Real-Time

Mobile games compete in an attention economy where a player's interest peaks in the first three sessions after install. Studios that discover a broken onboarding flow in the next morning's batch report have already lost thousands of players. App store algorithms react to install velocity, ratings, and session length within hours — not days.

Beyond acquisition, live operations decisions depend on current data: when to push a targeted offer, when to adjust ad frequency, when to flag a version rollout that is crashing on a specific device. Streaming SQL closes the gap between what players are doing and what the studio knows, turning analytics from a retrospective report into a live decision-support layer.

Ingesting Mobile Events

Mobile analytics events arrive via a server-side collection pipeline (Firebase, Amplitude forwarding, or a custom SDK) into Kafka. Define the source in RisingWave:

CREATE SOURCE mobile_events (
    event_id        VARCHAR,
    player_id       VARCHAR,
    device_id       VARCHAR,
    platform        VARCHAR,
    app_version     VARCHAR,
    country         VARCHAR,
    channel         VARCHAR,
    event_type      VARCHAR,
    level           INT,
    session_id      VARCHAR,
    revenue_usd     DECIMAL(10,4),
    ad_unit_id      VARCHAR,
    event_time      TIMESTAMPTZ
)
WITH (
    connector     = 'kafka',
    topic         = 'mobile.game.events',
    properties.bootstrap.server = 'kafka:9092',
    scan.startup.mode = 'latest'
)
FORMAT PLAIN ENCODE JSON;
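For reference, a payload on the mobile.game.events topic matching this schema might look like the following (all values are illustrative):

```json
{
  "event_id": "b7e2c1d4-0001",
  "player_id": "p_102938",
  "device_id": "d_5577",
  "platform": "ios",
  "app_version": "2.14.0",
  "country": "US",
  "channel": "unityads",
  "event_type": "purchase",
  "level": 12,
  "session_id": "s_84731",
  "revenue_usd": 4.99,
  "ad_unit_id": null,
  "event_time": "2024-06-01T12:34:56Z"
}
```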

Live DAU and Session Metrics

Daily active users and average session length are the two most-watched metrics in mobile gaming. Maintain them as continuously updated materialized views:

CREATE MATERIALIZED VIEW mobile_dau_live AS
SELECT
    window_start,
    window_end,
    platform,
    country,
    app_version,
    COUNT(DISTINCT player_id)                                       AS dau,
    COUNT(DISTINCT session_id)                                      AS total_sessions,
    COUNT(*) FILTER (WHERE event_type = 'install')                  AS installs,
    COUNT(DISTINCT player_id) FILTER (WHERE event_type = 'install') AS new_players,
    SUM(revenue_usd)                                                AS revenue_usd,
    ROUND(
        SUM(revenue_usd) / NULLIF(COUNT(DISTINCT player_id), 0), 4
    )                                                               AS arpu_usd
FROM TUMBLE(mobile_events, event_time, INTERVAL '1 hour')
GROUP BY window_start, window_end, platform, country, app_version;

Product managers query this view directly via Metabase or Grafana over the PostgreSQL interface. The numbers reflect events from the past few seconds, so hour-over-hour comparisons are accurate and timely.
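A dashboard panel might issue a query like this over the PostgreSQL interface (the platform and country values are illustrative):

```sql
-- Last 24 hourly windows of DAU and ARPU for iOS players in the US
SELECT window_start, dau, new_players, arpu_usd
FROM mobile_dau_live
WHERE platform = 'ios' AND country = 'US'
ORDER BY window_start DESC
LIMIT 24;
```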

Funnel Analysis with Windowed Aggregations

Onboarding funnel drop-off is the single biggest lever for mobile game growth. Build a live funnel view:

CREATE MATERIALIZED VIEW onboarding_funnel AS
SELECT
    window_start,
    window_end,
    channel,
    country,
    app_version,
    COUNT(DISTINCT player_id) FILTER (WHERE event_type = 'install')       AS step_install,
    COUNT(DISTINCT player_id) FILTER (WHERE event_type = 'tutorial_start') AS step_tutorial_start,
    COUNT(DISTINCT player_id) FILTER (WHERE event_type = 'tutorial_complete') AS step_tutorial_complete,
    COUNT(DISTINCT player_id) FILTER (WHERE event_type = 'first_purchase') AS step_first_purchase,
    ROUND(
        COUNT(DISTINCT player_id) FILTER (WHERE event_type = 'tutorial_complete')::DECIMAL /
        NULLIF(COUNT(DISTINCT player_id) FILTER (WHERE event_type = 'install'), 0) * 100, 2
    )                                                                      AS tutorial_completion_pct,
    ROUND(
        COUNT(DISTINCT player_id) FILTER (WHERE event_type = 'first_purchase')::DECIMAL /
        NULLIF(COUNT(DISTINCT player_id) FILTER (WHERE event_type = 'install'), 0) * 100, 2
    )                                                                      AS install_to_purchase_pct
FROM TUMBLE(mobile_events, event_time, INTERVAL '1 day')
GROUP BY window_start, window_end, channel, country, app_version;

When tutorial_completion_pct drops below your baseline after an app update, the streaming pipeline surfaces it within the current day — not the next morning.
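A sketch of an alerting query over this view — the 40% baseline and 100-install volume floor are illustrative thresholds, not recommendations:

```sql
-- App versions whose tutorial completion fell below baseline today
SELECT app_version, channel, country, tutorial_completion_pct
FROM onboarding_funnel
WHERE window_start >= date_trunc('day', NOW())
  AND step_install >= 100            -- ignore low-volume cohorts
  AND tutorial_completion_pct < 40
ORDER BY tutorial_completion_pct;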

Monetization Monitoring with Revenue Anomaly Detection

Track real-time revenue and detect anomalies (payment gateway outages, pricing errors) before they compound:

CREATE MATERIALIZED VIEW revenue_monitor AS
SELECT
    window_start,
    window_end,
    platform,
    country,
    COUNT(*)                                    AS purchase_count,
    SUM(revenue_usd)                            AS gross_revenue_usd,
    AVG(revenue_usd)                            AS avg_order_value,
    COUNT(*) FILTER (WHERE revenue_usd = 0)     AS zero_revenue_purchases,
    MAX(revenue_usd)                            AS max_single_purchase
FROM TUMBLE(mobile_events, event_time, INTERVAL '15 minutes')
WHERE event_type = 'purchase'
GROUP BY window_start, window_end, platform, country;

A spike in zero_revenue_purchases alongside a normal purchase_count points to a payment gateway fault: transactions are being recorded, but payment is not being collected.
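That alert condition can itself be expressed as a materialized view layered on revenue_monitor; the volume floor and 10% ratio below are illustrative:

```sql
CREATE MATERIALIZED VIEW revenue_alerts AS
SELECT window_start, platform, country,
       purchase_count, zero_revenue_purchases
FROM revenue_monitor
WHERE purchase_count >= 20                           -- illustrative volume floor
  AND zero_revenue_purchases * 10 > purchase_count;  -- >10% zero-revenue purchases
```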

Comparison: Mobile Analytics Pipeline Approaches

| Approach                  | Data Freshness | Cost     | Analyst Self-Service | Real-Time Alerting |
|---------------------------|----------------|----------|----------------------|--------------------|
| Firebase Analytics        | 24-48 hours    | Free/low | Medium               | No                 |
| Amplitude / Mixpanel      | Hours          | High     | High                 | Limited            |
| Custom Kafka + Spark      | Minutes        | High     | Low                  | Custom             |
| RisingWave streaming SQL  | Sub-second     | Low      | High (SQL)           | Yes                |

Sinking Live Metrics to a Dashboard Database

Push live mobile analytics to a PostgreSQL instance powering your BI tool:

CREATE SINK mobile_metrics_sink
FROM mobile_dau_live
WITH (
    connector = 'jdbc',
    jdbc.url = 'jdbc:postgresql://bi-db:5432/analytics',
    table.name = 'mobile_dau_hourly',
    type = 'upsert',
    primary_key = 'window_start,platform,country,app_version'
);
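The JDBC sink expects the target table to already exist on the Postgres side. A sketch of the matching DDL, to be run on bi-db (column types are inferred from the view above):

```sql
-- Run on bi-db (PostgreSQL), not in RisingWave
CREATE TABLE mobile_dau_hourly (
    window_start   TIMESTAMPTZ,
    window_end     TIMESTAMPTZ,
    platform       VARCHAR,
    country        VARCHAR,
    app_version    VARCHAR,
    dau            BIGINT,
    total_sessions BIGINT,
    installs       BIGINT,
    new_players    BIGINT,
    revenue_usd    DECIMAL,
    arpu_usd       DECIMAL,
    PRIMARY KEY (window_start, platform, country, app_version)
);
```

With type = 'upsert', each refresh of a window's row overwrites the previous value instead of appending duplicates.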

Superset, Metabase, or Grafana connect to bi-db and render charts that update automatically as the sink writes new rows.

Attribution and Channel Performance

Understanding which acquisition channels produce the most valuable players drives UA budget allocation. Build a channel cohort view:

CREATE MATERIALIZED VIEW channel_cohort_performance AS
SELECT
    window_start,
    window_end,
    channel,
    country,
    COUNT(DISTINCT player_id) FILTER (WHERE event_type = 'install')         AS installs,
    SUM(revenue_usd)                                                         AS total_revenue,
    COUNT(DISTINCT player_id) FILTER (WHERE revenue_usd > 0)                AS paying_players,
    ROUND(
        SUM(revenue_usd) / NULLIF(COUNT(DISTINCT player_id) FILTER (WHERE event_type = 'install'), 0), 4
    )                                                                        AS ltv_estimate_usd
FROM TUMBLE(mobile_events, event_time, INTERVAL '1 day')
GROUP BY window_start, window_end, channel, country;

UA managers query this view daily to see which channels are generating the highest LTV players and reallocate budgets accordingly.
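A daily channel ranking over this view might look like the following (the 7-day lookback is an illustrative choice):

```sql
-- Channels ranked by blended revenue-per-install over the trailing week
SELECT channel,
       SUM(installs)                                            AS installs,
       SUM(total_revenue)                                       AS revenue_usd,
       ROUND(SUM(total_revenue) / NULLIF(SUM(installs), 0), 4)  AS blended_ltv_usd
FROM channel_cohort_performance
WHERE window_start >= NOW() - INTERVAL '7 days'
GROUP BY channel
ORDER BY blended_ltv_usd DESC;
```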

FAQ

Q: How do I handle event deduplication when mobile clients retry failed submissions? A: Include event_id in your Kafka messages and deduplicate in a materialized view by keeping only the first row seen per event_id. For sink targets, use upsert mode keyed on event_id.
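One way to express that deduplication in RisingWave, assuming the mobile_events source defined above, is a ROW_NUMBER-based view that keeps the earliest row per event_id:

```sql
CREATE MATERIALIZED VIEW mobile_events_dedup AS
SELECT event_id, player_id, platform, event_type, revenue_usd, event_time
FROM (
    SELECT *,
           ROW_NUMBER() OVER (PARTITION BY event_id ORDER BY event_time) AS rn
    FROM mobile_events
)
WHERE rn = 1;  -- first occurrence wins; retries are dropped
```

Downstream views can then read from mobile_events_dedup instead of the raw source.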

Q: Can RisingWave ingest events directly from Firebase or Amplitude? A: Not directly. Use Firebase's BigQuery export or Amplitude's data export to forward events to Kafka, then connect RisingWave to the Kafka topic. Several ETL tools (Airbyte, Fivetran) support this pipeline.

Q: How do I measure D1, D7, and D30 retention in a streaming database? A: For retention cohorts, combine the streaming install events with a RisingWave table that tracks the install date per player. Join session events against that table to count returning players within each retention window.
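A sketch of the D1 case under that approach — both view names are hypothetical, and the join assumes the mobile_events source defined earlier:

```sql
-- One row per player with their earliest install time
CREATE MATERIALIZED VIEW player_installs AS
SELECT player_id, MIN(event_time) AS install_time
FROM mobile_events
WHERE event_type = 'install'
GROUP BY player_id;

-- Players seen again 24-48 hours after install, grouped by install cohort
CREATE MATERIALIZED VIEW d1_retention AS
SELECT i.install_time::DATE              AS cohort_date,
       COUNT(DISTINCT e.player_id)       AS d1_returned
FROM player_installs i
JOIN mobile_events e
  ON e.player_id = i.player_id
 AND e.event_time >= i.install_time + INTERVAL '1 day'
 AND e.event_time <  i.install_time + INTERVAL '2 days'
GROUP BY i.install_time::DATE;
```

D7 and D30 follow the same shape with wider interval bounds.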

Q: What happens if my event collection pipeline has an outage? A: Kafka retains events until they are consumed. RisingWave replays from the last committed offset on restart, so no data is lost. The materialized views catch up automatically once the pipeline resumes.

Q: Is RisingWave suitable for a hyper-casual game with very high install volume? A: Yes. Hyper-casual games often generate 100,000+ installs per day with simple event schemas — an ideal fit for RisingWave's high-throughput ingestion and lightweight aggregation.

Build an Analytics Pipeline That Keeps Pace With Your Players

Mobile games move fast. With RisingWave, your analytics pipeline moves just as fast — so every decision is based on what's happening right now, not what happened yesterday.

Start building at https://docs.risingwave.com/get-started and join the mobile gaming analytics community at https://risingwave.com/slack.
