Bobby Iliev

Originally published at blog.upstash.com

Serverless Node.js URL Shortener App powered by Upstash Kafka and Materialize

This is a simple Node.js URL shortener app that runs on Cloudflare Workers.

The app uses Upstash Redis to store the short links, Upstash Kafka to store the click events, and Materialize for real-time analytics on those events.

Upstash offers serverless, low-latency, pay-as-you-go solutions for Kafka and Redis.

Materialize is a streaming database for real-time applications. Materialize accepts input data from a variety of streaming sources (like Kafka), data stores and databases (like S3 and Postgres), and files (like CSV and JSON), and lets you query them using SQL.

App structure

The demo app has the following structure:

  • A serverless Cloudflare Worker that lets you create short links and redirect them to other URLs.
  • All data is stored in an Upstash serverless Redis cluster as key-value pairs (short link -> long link).
  • Every time someone visits a short link, the Worker records a click event in Upstash Kafka.
  • Materialize then consumes the data from Upstash Kafka and analyzes it in real time.
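
To make the flow concrete, here is a minimal sketch of what the Worker's redirect path can look like, assuming the official @upstash/redis and @upstash/kafka SDKs and the secret names used later in this post; the actual code in the repository may differ:

// Illustrative sketch only; the actual repository code may differ.
import { Redis } from "@upstash/redis/cloudflare";
import { Kafka } from "@upstash/kafka";

export default {
  async fetch(
    // request.cf carries Cloudflare's geo data (country, city).
    request: Request & { cf?: { country?: string; city?: string } },
    env: Record<string, string>
  ): Promise<Response> {
    const redis = new Redis({
      url: env.UPSTASH_REDIS_REST_URL,
      token: env.UPSTASH_REDIS_REST_TOKEN,
    });
    const kafka = new Kafka({
      url: env.UPSTASH_KAFKA_REST_URL,
      username: env.UPSTASH_KAFKA_REST_USERNAME,
      password: env.UPSTASH_KAFKA_REST_PASSWORD,
    });

    // Look up the long URL for the requested short code in Redis.
    const shortCode = new URL(request.url).pathname.slice(1);
    const longUrl = await redis.get<string>(shortCode);
    if (!longUrl) {
      return new Response("Not found", { status: 404 });
    }

    // Record a click event in the visits-log Kafka topic; the fields mirror
    // the ones extracted by the Materialize view later in this post.
    await kafka.producer().produce("visits-log", {
      shortCode,
      longUrl,
      country: request.cf?.country,
      city: request.cf?.city,
      ip: request.headers.get("cf-connecting-ip"),
    });

    return Response.redirect(longUrl, 302);
  },
};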

A demo of the app can be found here:

https://cf-url-shortener.bobbyiliev.workers.dev/admin

Diagram

The following is a diagram of the app structure:

Diagram of Upstash and Materialize demo

Demo

Here is a quick demo of how the app works:

mz-upstash-demo

Prerequisites

Before you get started, you need to make sure that you have the following:

  • A Redis cluster and a Kafka cluster in Upstash.
  • A Kafka topic in Upstash called visits-log.
  • The Cloudflare CLI tool called wrangler installed on your local machine, as described here.
  • A Materialize instance running on your local machine as described here or a Materialize Cloud instance.

Running this demo

Once you have all the prerequisites, you can proceed with the following steps:

  • Clone the repository and run the following command:
git clone https://github.com/bobbyiliev/cf-url-shortener.git
  • Access the directory:
cd cf-url-shortener
  • Install the npm dependencies:
npm install
  • Run the wrangler command to authenticate with Cloudflare:
wrangler login
  • Then in the wrangler.toml file, update the account_id to match your Cloudflare account ID:
account_id = "YOUR_ACCOUNT_ID_HERE"
  • Set the following secrets in Cloudflare using the wrangler tool:
wrangler secret put UPSTASH_REDIS_REST_URL
wrangler secret put UPSTASH_REDIS_REST_TOKEN
wrangler secret put UPSTASH_KAFKA_REST_URL
wrangler secret put UPSTASH_KAFKA_REST_USERNAME
wrangler secret put UPSTASH_KAFKA_REST_PASSWORD

Make sure to use the REST API URLs and not the Broker details.

  • Run the following command to deploy the CF Worker:
wrangler deploy

With the CF Worker deployed, you can visit the admin URL, where you can add short links that redirect to other URLs.
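
Under the hood, adding a short link boils down to writing a key-value pair to Upstash Redis, as described in the app structure above. A hedged sketch of that step, assuming the @upstash/redis SDK (the repository's actual route and function names may differ):

// Illustrative sketch; env holds the secrets set with `wrangler secret put` above.
import { Redis } from "@upstash/redis/cloudflare";

async function addShortLink(env: Record<string, string>, shortCode: string, longUrl: string) {
  const redis = new Redis({
    url: env.UPSTASH_REDIS_REST_URL,
    token: env.UPSTASH_REDIS_REST_TOKEN,
  });
  // Store the mapping short link -> long link.
  await redis.set(shortCode, longUrl);
}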

Set up Materialize

Once you've deployed the CF Worker, you can set up Materialize to analyze the data in Upstash Kafka in real-time.

Start by creating a new Materialize instance in Materialize Cloud, or alternatively install Materialize locally.

After you've created the instance, you can connect to it using the psql command as shown in the docs.

Create a Kafka Source

The CREATE SOURCE statement connects Materialize to an external Kafka data source and lets you interact with its data as if it were in a SQL table.

To create a new Kafka source in Materialize, run the following statement:

CREATE SOURCE click_stats
  FROM KAFKA BROKER 'UPSTASH_KAFKA_BROKER_URL' TOPIC 'visits-log'
  WITH (
      security_protocol = 'SASL_SSL',
      sasl_mechanisms = 'SCRAM-SHA-256',
      sasl_username = 'UPSTASH_KAFKA_BROKER_USERNAME',
      sasl_password = 'UPSTASH_KAFKA_BROKER_PASSWORD'
  )
FORMAT BYTES;

Change the Kafka details to match your Upstash Kafka cluster Broker and credentials.

Next, we will create a non-materialized view, which you can think of as a reusable query template that other materialized views can build on:

CREATE VIEW click_stats_v AS
    SELECT
        *
    FROM (
        SELECT
            (data->>'shortCode')::string AS short_code,
            (data->>'longUrl')::string AS long_url,
            (data->>'country')::string AS country,
            (data->>'city')::string AS city,
            (data->>'ip')::string AS ip
        FROM (
            SELECT CAST(data AS jsonb) AS data
            FROM (
                SELECT convert_from(data, 'utf8') AS data
                FROM click_stats
            )
        )
    );

Finally, create a materialized view to analyze the data in the Kafka source:

CREATE MATERIALIZED VIEW click_stats_m AS
    SELECT
        *
    FROM click_stats_v;

Then you can query the materialized view using standard SQL and get the results in real time, with sub-millisecond latency:

SELECT * FROM click_stats_m;
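
Because Materialize speaks the Postgres wire protocol, you can also run the same query from Node.js with the pg driver instead of psql. A minimal sketch, assuming a local Materialize instance with the default connection settings (adjust the host and port for Materialize Cloud):

import { Client } from "pg";

async function main() {
  // Placeholder connection settings for a local Materialize instance.
  const client = new Client({
    host: "localhost",
    port: 6875,
    user: "materialize",
    database: "materialize",
  });
  await client.connect();

  // Standard SQL query against the materialized view.
  const { rows } = await client.query("SELECT * FROM click_stats_m");
  console.table(rows);

  await client.end();
}

main().catch(console.error);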

You can stack materialized views on top of each other, so let's create one that counts the number of clicks per short link:

CREATE MATERIALIZED VIEW order_by_clicks AS
    SELECT
        short_code,
        COUNT(*) AS clicks
    FROM click_stats_m
    GROUP BY short_code;

One of the great features of Materialize is TAIL.

TAIL streams updates from a source, table, or view as they occur.

So to stream the data from our materialized view using TAIL, we can use the following statement:

COPY ( TAIL ( SELECT * FROM order_by_clicks ) ) TO STDOUT;
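
You can also consume TAIL from Node.js: Materialize lets you declare a cursor over TAIL inside a transaction and fetch from it in a loop. A sketch, again assuming the pg driver and a local instance with default connection settings:

import { Client } from "pg";

async function tailClicks() {
  // Placeholder connection settings for a local Materialize instance.
  const client = new Client({
    host: "localhost",
    port: 6875,
    user: "materialize",
    database: "materialize",
  });
  await client.connect();

  // TAIL is consumed through a cursor inside a transaction.
  await client.query("BEGIN");
  await client.query("DECLARE c CURSOR FOR TAIL order_by_clicks");

  // Each fetched row includes mz_timestamp and mz_diff columns describing
  // when the row changed and whether it was added (+1) or removed (-1).
  while (true) {
    const { rows } = await client.query("FETCH ALL c");
    for (const row of rows) {
      console.log(row);
    }
  }
}

tailClicks().catch(console.error);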

For more information about TAIL, check out this blog post:

Subscribe to changes in a view with TAIL in Materialize

Display the results in Metabase

As Materialize is Postgres-wire compatible, you can use BI tools like Metabase to create business intelligence dashboards using the real-time data streams in your Materialize instance.

For more information about Metabase + Materialize, check out the official documentation:

Metabase + Materialize

Example dashboard that shows the number of clicks per short link:

Materialize Metabase Dashboard

Conclusion

Using Materialize to analyze the data in your Upstash Kafka serverless instance is a great way to get real-time insights into your data.

As a next step, check out the Materialize and Upstash documentation to learn more.

Top comments (4)

Rishabh Rawat

Hey Bobby, it was a nice read. I had also recently worked on a URL shortener that was built with a similar stack.

It used Fastify with MongoDB as the persistence layer. Kafka was used to send click events, similar to the implementation here.

Interesting to see the implementation with Upstash. Kudos!

Bobby Iliev

Oh that is super cool! Mine is mainly a PoC. Do you have a link to yours? I would love to check it out if it is open-source 🙌

Rishabh Rawat

Yep, it was an internal tool. I open-sourced it after abstracting away some of our business logic; it is available here. And here's the blog post.

One correction btw: we ended up using PostgreSQL instead of MongoDB, as we were already using it for similar purposes 😁.

Bobby Iliev

That is awesome! Thanks for sharing! I will be checking it out later on! Might try to extend this and add Materialize for real time data analytics 🚀