Stream AWS EventBridge Events Directly to RisingWave

Getting real-time insights from your AWS events often means setting up a pipeline with Kinesis or Kafka. We've built a simpler way. You can now stream events directly from Amazon EventBridge to a RisingWave webhook. This lets you analyze and act on events the moment they happen, using standard SQL. Here’s a quick guide on how to set it up and a few ideas for what you can build.

Why Go Direct?

The main advantage is cutting out the middleman. You get a simpler and cheaper pipeline without needing to manage Kinesis or Kafka. EventBridge also lets you filter events at the source, so you only send the data you actually need to RisingWave. This reduces noise and processing costs. Once the data is in, you can immediately use SQL to build live dashboards, alerts, and automated workflows.

What You Can Build

  • Real-Time Security Alerts: Watch for specific events like an IAM user creation and trigger an instant Slack alert.

  • Live S3 Dashboards: Track every new file in your data lake and build a live Grafana dashboard to monitor ingestion rates.

  • Automated Ops: When an EC2 instance fails, use its event to automatically create a detailed PagerDuty ticket.

How to Set It Up

Here’s a quick look at what it takes to get this running.

Step 1: Create a Webhook in RisingWave

First, create a table in RisingWave to receive the JSON events from EventBridge.

CREATE TABLE eventbridge_events (
    payload JSONB
) WITH (
    connector = 'webhook'
);

To make sure only EventBridge can send data to this endpoint, create a secret and use it to validate the Authorization header on every request:

CREATE SECRET webhook_secret WITH (backend = 'meta') AS 'my-secret-token';

CREATE TABLE eventbridge_events (
    payload JSONB
) WITH (
    connector = 'webhook'
) VALIDATE SECRET webhook_secret AS secure_compare(
    headers->>'authorization',
    webhook_secret
);
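The point of `secure_compare` is that it matches the incoming header against your secret in constant time, so an attacker cannot probe the token byte by byte via response timing. A rough Python analogy of that check (illustrative only; the `authorize` helper is hypothetical, not part of RisingWave):

```python
import hmac

def authorize(headers: dict, secret: str) -> bool:
    # Compare the caller-supplied Authorization header against the stored
    # secret without short-circuiting on the first mismatched byte.
    supplied = headers.get("authorization", "")
    return hmac.compare_digest(supplied.encode(), secret.encode())

print(authorize({"authorization": "my-secret-token"}, "my-secret-token"))  # True
print(authorize({"authorization": "wrong-token"}, "my-secret-token"))      # False
```

A plain `==` comparison would work functionally, but `hmac.compare_digest` is the standard way to avoid the timing side channel.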

Step 2: Configure the EventBridge Rule

In your AWS Console, go to Amazon EventBridge and:

  1. Create an API destination: Specify the RisingWave webhook URL, and set up an API key for authorization with the key name Authorization and your secret token as the value. The webhook URL follows the format https://<HOST>/webhook/<database>/<schema_name>/<table_name>, where <table_name> is the name of the table you created in RisingWave to receive data.

  2. Create a rule: Define an event pattern to filter for only the events you want (e.g., from aws.s3 in a specific bucket).

  3. Set the target: Point the rule to the API Destination you just created.

Now, any AWS event matching your rule will be sent directly to RisingWave.
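As an illustration, a rule that forwards only S3 "Object Created" events from a single bucket could use an event pattern like the following (the bucket name is a placeholder):

```json
{
  "source": ["aws.s3"],
  "detail-type": ["Object Created"],
  "detail": {
    "bucket": {
      "name": ["my-data-lake-bucket"]
    }
  }
}
```

Filtering here, rather than in SQL, means events you never care about are dropped before they leave AWS.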

Step 3: Create a Materialized View to Parse the JSON

The raw events arrive as JSON. Use a materialized view in RisingWave to parse them into structured columns you can easily query.

CREATE MATERIALIZED VIEW s3_uploads_summary AS
SELECT
    (payload->'detail'->'bucket'->>'name') AS bucket,
    (payload->>'time')::timestamptz AS event_time,
    (payload->'detail'->'object'->>'key') AS object_key,
    (payload->'detail'->'object'->>'size')::bigint AS size_bytes
FROM eventbridge_events
WHERE
    payload->>'source' = 'aws.s3' AND
    payload->>'detail-type' = 'Object Created';

This view stays constantly up-to-date as new events stream in. You can now connect it to dashboards or use it to trigger alerts.
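From here you can layer further views on top of the parsed events. For example, a per-bucket ingestion counter over one-minute tumbling windows might look like this (an illustrative sketch; the view name is hypothetical):

```sql
CREATE MATERIALIZED VIEW uploads_per_bucket AS
SELECT
    bucket,
    window_start,
    COUNT(*) AS files_uploaded,
    SUM(size_bytes) AS bytes_uploaded
FROM TUMBLE(s3_uploads_summary, event_time, INTERVAL '1 MINUTE')
GROUP BY bucket, window_start;
```

A dashboard polling this view gets a continuously maintained ingestion rate without any batch jobs.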

Ready to Try It?

This direct integration is a simple way to build alerts and live dashboards on top of your AWS events, with less effort and fewer moving parts.
