The rise of real-time event-driven applications has led to the development of modern technology stacks that can handle large volumes of events in real time. Real-time event-driven applications are characterized by their ability to respond immediately to events as they occur, providing users with up-to-date information and faster feedback.
To build a modern stack for a real-time event-driven application, you need to consider several components and technologies that can handle the different stages of event processing, from event collection to the user interface. In this article, we will walk through the components of such a stack using a sample real-time event-driven online application that provides customers with discount information from different markets in a city.
Learning objectives
By the end of this article, you will be able to:
- Understand the limitations of continuously polling data from a server.
- Build an event-driven architecture for a discount web application.
- Show data to users in real time.
Understanding the app use cases and requirements
A website or mobile application that provides discount information from different markets in a city to customers can be a useful tool for shoppers who are looking to save money on their purchases. The app can provide real-time information about discounts and deals in various markets, allowing users to quickly and easily find the best deals.
The picture below shows the website when opened on a mobile device:
Customers can use this website to compare prices at different markets and receive notifications about discounts and deals at markets close to their current location, as soon as those markets publish new entries to the system. They can use the app to plan their shopping trips and budget their purchases based on the available discounts and deals.
Some core technical requirements for this app may include:
- Provide real-time information about discounts and deals at various markets, ensuring that customers always have access to the most up-to-date data.
- Filter this data based on the user's location.
- Allow users to search for discounts and deals by specific products or categories.
- Implement user authentication to provide personalized deal information based on the user's preferences.
An important question arises here: how do we show real-time discount information to users (while they are using the website) as this data appears in city markets? Without diving into the full details of this website architecture, let's focus only on finding a solution for one technical requirement: retrieving discount data from the server in real time.
Evaluate the first typical solution
The first, straightforward architecture fetches discount changes from the server on a timer and shows them in a single-page frontend app. The page sends a request to the server every three seconds to ask for discounts. The response returns an array of discounts, which are then displayed to the user.
This design is known as a timer-based polling approach. The sample client might use JavaScript libraries such as React to compose the website UI and the Axios HTTP client to send requests to a backend API endpoint (the backend can expose a REST API built with Node.js or any low-code framework).
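Below is a minimal sketch of such a polling client as a React component. The `/api/discounts` endpoint, the three-second interval, and the response shape are illustrative assumptions, not a fixed part of the design.

```jsx
import { useEffect, useState } from "react";
import axios from "axios";

// Minimal polling client: asks the backend for the full discount list
// every three seconds, whether or not anything has changed.
export default function DiscountList() {
  const [discounts, setDiscounts] = useState([]);

  useEffect(() => {
    const fetchDiscounts = async () => {
      try {
        // Hypothetical REST endpoint that returns an array of discounts.
        const { data } = await axios.get("/api/discounts");
        setDiscounts(data);
      } catch (err) {
        console.error("Failed to fetch discounts", err);
      }
    };

    fetchDiscounts();                                // initial load
    const timer = setInterval(fetchDiscounts, 3000); // poll every 3 seconds
    return () => clearInterval(timer);               // stop polling on unmount
  }, []);

  return (
    <ul>
      {discounts.map((d) => (
        <li key={d.id}>
          {d.market}: {d.product} ({d.discountPercent}% off)
        </li>
      ))}
    </ul>
  );
}
```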
The discount information is regularly updated by another integrated API that receives external calls from different markets (it can expose a webhook endpoint so that markets can notify it via a callback as soon as new discounts become available), and the data is then stored on the server in a relational database (MySQL, PostgreSQL). When our backend discount API is triggered by an HTTP request from the client website, it returns the content from the database.
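As a rough sketch of that ingestion path (the route name, payload fields, and table layout are assumptions for illustration only), an Express-based webhook handler might look like this:

```js
const express = require("express");
const { Pool } = require("pg");

const app = express();
app.use(express.json());
const pool = new Pool({ connectionString: process.env.DATABASE_URL });

// Hypothetical webhook endpoint that markets call when they publish a new discount.
app.post("/webhooks/market-discounts", async (req, res) => {
  const { marketId, product, discountPercent, city } = req.body;
  try {
    await pool.query(
      `INSERT INTO discounts (market_id, product, discount_percent, city, created_at)
       VALUES ($1, $2, $3, $4, NOW())`,
      [marketId, product, discountPercent, city]
    );
    res.status(201).json({ status: "stored" });
  } catch (err) {
    console.error("Failed to store discount", err);
    res.status(500).json({ status: "error" });
  }
});

app.listen(3000, () => console.log("Discount ingestion API listening on :3000"));
```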
Limitations of the current solution
The actual issue lies not in how fast we ingest market discount data from data providers, but in how fast we deliver it to the UI (assuming the data is already stored in our database and has been analyzed and processed). Let's think about some of the weaknesses of this approach.
Between the discount frontend app and the backend service, there is a constant stream of poll requests for changes. Because it is timer-based, the client application contacts the server whether or not the underlying data has changed. Once data is returned from the server, the entire list of discounts is re-rendered on the web page, regardless of whether anything changed. This is highly inefficient, and calls may return empty payloads when there have been no updates in the database. Also, what if the HTTP API accepts our request but takes a long time to handle it? This can hurt the user experience, especially when the delay is visible in the user interface (meaning a user has to wait for the next poll, or refresh the page, to see the latest discount changes).
Now that you're familiar with the first web app architecture and its limitations, it's time to introduce a new design that solves the above issues.
Real-Time Event-Driven Data Exchange
Event-Driven Architecture (EDA) seems an ideal fit for the above technical requirements. In an event-driven architecture, components are designed to react to events as they occur, rather than being invoked by other components or services. This is exactly what we want: instead of constantly polling the backend for changes, the design uses a push-based approach and lets the backend server send a message or notification to all connected clients automatically.
Here, real-time refers to the ability of our system to respond to events or input immediately, or within a very short period of time. In the context of the discount web app, it means processing discount events as soon as they occur, without any significant delay.
The diagram below shows the new architecture of components in the discount app. It consists of four main stages: detecting discount data changes, ingesting them, propagating them to event consumers, and showing them in the UI. Basically, it reverses the flow of the first solution's design.
Let’s break down each component and understand their roles in the next section.
New architecture breakdown
This new design adds real-time behavior to our discount data, reduces traffic, and makes the UI more efficient by updating it only when data changes. To do so, it leverages a few open-source technologies and tools for event streaming.
The first component is a database that acts as the data source; here we use PostgreSQL (other popular options include MongoDB or MySQL). As data changes, the database records each change in its transaction log, and the log-based CDC (Change Data Capture) capability of the database captures those changes from the log. The captured changes are then transformed into change events that can be consumed in real time by downstream systems, such as the Kafka message broker.
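For log-based CDC to work, PostgreSQL must expose its write-ahead log for logical decoding. A minimal sketch of that setup follows; the table definition is a hypothetical example, and changing `wal_level` requires a server restart.

```sql
-- Enable logical decoding so CDC tools can read change events from the WAL
-- (requires a PostgreSQL restart to take effect).
ALTER SYSTEM SET wal_level = 'logical';

-- Hypothetical table holding the ingested market discounts.
CREATE TABLE discounts (
    id               SERIAL PRIMARY KEY,
    market_id        INT          NOT NULL,
    product          VARCHAR(255) NOT NULL,
    discount_percent INT          NOT NULL,
    city             VARCHAR(255) NOT NULL,
    created_at       TIMESTAMPTZ  DEFAULT NOW()
);
```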
We will use the Debezium connector for Postgres to extract these CDC logs as event streams from the database into Kafka. Once our discount events arrive in Kafka, a streaming database such as RisingWave can subscribe to the change feed from the Kafka topics. RisingWave reads the events and stores them in its local persistent storage in the form of materialized views to achieve fault tolerance. In other words, it acts as a stateful stream processor that materializes the CDC event stream into relational tables representing the current state. The streaming database lets us quickly build real-time materialized views that show all discount information matching user-specified criteria, prepare statistics such as the top 5 deals on a chosen product, or find the closest deals across different markets. Additionally, it lets us analyze the data by delivering it to BI and data analytics platforms, so we can make better business decisions based on how our web application is used.
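As a rough sketch of that subscription (the topic name, field names, and connector properties are assumptions, and the exact `WITH` options and `FORMAT` clause vary between RisingWave versions, so check the RisingWave documentation), declaring the Kafka-backed source in RisingWave looks roughly like this:

```sql
-- Subscribe to the discount change events flowing through Kafka.
-- Connector property names and the FORMAT/ENCODE clause depend on the
-- RisingWave version; Debezium-formatted CDC topics use a DEBEZIUM format.
CREATE SOURCE discount_events (
    market_id        INT,
    product          VARCHAR,
    discount_percent INT,
    city             VARCHAR,
    created_at       TIMESTAMPTZ
) WITH (
    connector = 'kafka',
    topic = 'pg.public.discounts',
    properties.bootstrap.server = 'kafka:9092',
    scan.startup.mode = 'earliest'
) FORMAT PLAIN ENCODE JSON;
```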
RisingWave is a streaming database specialized in real-time analytics. It can read database change events directly from Postgres write-ahead logs or from Kafka topics and build a materialized view by joining multiple events together. RisingWave keeps the view up to date as new events arrive and lets you query it with SQL to access the latest changes made to the data.
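For example, a materialized view over the hypothetical `discount_events` source above, and a query against it, could look like this; RisingWave keeps the view incrementally up to date as events arrive:

```sql
-- Incrementally maintained view: best current deal per product and market.
CREATE MATERIALIZED VIEW best_deals AS
SELECT product,
       market_id,
       MAX(discount_percent) AS best_discount
FROM discount_events
GROUP BY product, market_id;

-- Plain SQL query returning the five best current deals for a product.
SELECT market_id, best_discount
FROM best_deals
WHERE product = 'coffee'
ORDER BY best_discount DESC
LIMIT 5;
```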
The streaming database writes the results back to Kafka topics in real time using a sink operation. Now, we need to add JavaScript code that consumes and processes the discount messages received from Kafka and updates the UI in real time to display them. We can use the KafkaJS library, a popular Kafka client for Node.js, to listen for and consume Kafka messages.
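A minimal sketch of such a consumer with KafkaJS follows; the broker address, topic name, message shape, and the `consumeDiscounts` helper are assumptions for illustration.

```js
const { Kafka } = require("kafkajs");

const kafka = new Kafka({
  clientId: "discount-ui-backend",
  brokers: ["localhost:9092"], // assumed broker address
});

const consumer = kafka.consumer({ groupId: "discount-ui" });

// Consume discount updates emitted by the streaming database's Kafka sink
// and hand each one to a callback that pushes it toward the UI layer.
async function consumeDiscounts(onDiscount) {
  await consumer.connect();
  await consumer.subscribe({ topic: "discount_updates", fromBeginning: false });

  await consumer.run({
    eachMessage: async ({ message }) => {
      const discount = JSON.parse(message.value.toString());
      onDiscount(discount);
    },
  });
}

module.exports = { consumeDiscounts };
```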
Now you can wire this Kafka consumer into your React (or React Native) frontend to update the UI as new discount data is received. Since KafkaJS is a Node.js client, in practice the consumer runs in a small backend process that relays each update to the frontend, for example over a WebSocket or Server-Sent Events, and the UI component subscribes to that stream.
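As one possible sketch (the WebSocket relay at `ws://localhost:8080/discounts` is an assumption, not part of the original architecture), the React side could look like this:

```jsx
import { useEffect, useState } from "react";

// Subscribes to the relayed discount stream and re-renders only when
// a new discount event actually arrives (no timer-based polling).
export default function LiveDiscounts() {
  const [discounts, setDiscounts] = useState([]);

  useEffect(() => {
    // Hypothetical WebSocket endpoint fed by the KafkaJS consumer above.
    const socket = new WebSocket("ws://localhost:8080/discounts");

    socket.onmessage = (event) => {
      const discount = JSON.parse(event.data);
      setDiscounts((current) => [discount, ...current]);
    };

    return () => socket.close();
  }, []);

  return (
    <ul>
      {discounts.map((d, i) => (
        <li key={i}>
          Market {d.market_id}: {d.product} ({d.best_discount}% off)
        </li>
      ))}
    </ul>
  );
}
```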
Conclusion
In this post, we learned how to design an architecture for a web app that provides real-time discount information from different markets. The web app can help users find the best deals and make informed shopping decisions. We evaluated two different solutions: one that implements timer-based polling and one that uses event-driven patterns.
We saw that building a modern stack for a real-time event-driven application requires a range of technologies: a combination of stream processors, a streaming database, and modern JavaScript frameworks and libraries. The streaming database handles complex queries over frequently changing data by pre-computing the results as materialized views. In the second architecture, we also avoided introducing an additional service or microservice with custom stream-processing logic, which would have increased operational overhead and deployment complexity.
Related resources
- How to choose the right streaming database
- Issue SQL queries to manage your data
- Query Real-Time Data in Kafka Using SQL
- How Change Data Capture (CDC) Works with Streaming Database
Recommended content
- Build Your First Clickstream Analytics Dashboard: An End-to-End Guide
- Why you should consider Event-Driven Serverless Architectures
Community
🙋 Join the RisingWave Community
About the author
Visit my personal blog: www.iambobur.com