Move your data from Apache Pulsar to Snowflake, continuously

Continuously ingest data from different sources, transform it on the fly, and deliver it to any destination using RisingWave’s connectors.
Apache Pulsar → RisingWave → Snowflake
Apache Pulsar
CREATE SOURCE IF NOT EXISTS orders_rw (
    order_id INTEGER PRIMARY KEY,
    customer_id INTEGER,
    order_status VARCHAR,
    total_amount DECIMAL,
    last_updated TIMESTAMP)
WITH (
    connector='pulsar',
    topic='demo_topic',
    service.url='pulsar://localhost:6650/',
    oauth.issuer.url='https://auth.streamnative.cloud/',
    oauth.credentials.url='s3://bucket_name/your_key_file.file',
    oauth.audience='urn:sn:pulsar:o-d6fgh:instance-0',
    aws.credentials.access_key_id='aws.credentials.access_key_id',
    aws.credentials.secret_access_key='aws.credentials.secret_access_key',
    scan.startup.mode='latest',
    scan.startup.timestamp.millis='140000000'
) FORMAT PLAIN ENCODE AVRO (
    message = 'message',
    schema.location = 'https://demo_bucket_name.s3-us-west-2.amazonaws.com/demo.avsc'
);
For comprehensive configuration details, please refer to the Pulsar connector documentation.
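Once the source is defined, one common way to verify that events are arriving from Pulsar is to build a materialized view on top of it and query that view. This is only an illustrative sketch; the view name and column selection below are assumptions, not part of the pipeline above.

CREATE MATERIALIZED VIEW orders_check AS
SELECT order_id, order_status, total_amount, last_updated
FROM orders_rw;

SELECT * FROM orders_check LIMIT 5;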
RisingWave
CREATE SINK snowflake_sink AS
SELECT
    order_status,
    COUNT(*) as order_count,
    SUM(total_amount) as total_revenue,
    AVG(total_amount) as avg_order_value,
    MIN(last_updated) as first_order_time,
    MAX(last_updated) as last_order_time
FROM orders_rw
GROUP BY order_status
WITH (
    connector = 'snowflake',
    type = 'append-only',
    s3.bucket_name = 'my_s3_bucket',
    s3.credentials.access = 'credentials_access',
    s3.credentials.secret = 'credentials_secret',
    s3.region_name = 'us-west-2',
    s3.path = 'data/uploads/',
    force_append_only = 'true'
);
For comprehensive configuration details, please refer to the Snowflake connector documentation.
Snowflake
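The Snowflake sink above stages the aggregated results as files in the configured S3 bucket, and Snowflake then loads those files, typically via Snowpipe. The following is a minimal sketch of the Snowflake-side setup under stated assumptions: the table, stage, and pipe names, as well as the JSON file format, are illustrative and must match your own bucket, path, and data layout. Refer to the Snowflake connector documentation for the exact requirements.

CREATE TABLE order_stats (
    order_status VARCHAR,
    order_count NUMBER,
    total_revenue NUMBER(38, 2),
    avg_order_value NUMBER(38, 2),
    first_order_time TIMESTAMP_NTZ,
    last_order_time TIMESTAMP_NTZ
);

-- 'my_s3_stage' is a hypothetical external stage pointing at
-- s3://my_s3_bucket/data/uploads/, the location used by the sink above.
CREATE PIPE order_stats_pipe
    AUTO_INGEST = TRUE
    AS COPY INTO order_stats
       FROM @my_s3_stage
       FILE_FORMAT = (TYPE = JSON)
       MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE;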