Let’s build an application using Node.js and Redis that uses Redis for caching and also leverages the Redis pub-sub pattern.
In this article, we will explore how you can use Redis as a caching layer and also leverage the power of the Publish/Subscribe messaging paradigm it provides. We are using Node.js as our backend technology here, but Redis has clients for most programming languages.
Introduction
First things first, let’s get an overview of the entire system we are planning to build. For a better visual understanding, I have drawn an architecture diagram that shows how the different services relate to each other, as well as the role of Redis in the overall system.
If you aren’t familiar with microservice-based architecture, I recommend reading this article: Microservices Orchestration vs Choreography | (Technology). It gives a basic understanding of microservices and explains the difference between orchestration and choreography.
For demonstration purposes, we are going to build two microservices using Node.js. After that, we will create a login API in one of the services, which will work with the user’s login details. Microservice one will first check whether the data is present in Redis; if it cannot find the data there, it will go to the database, which is the ultimate source of truth.
Once it finds the data, it will update Redis, and subsequent requests for that entry will be served from Redis.
After that, the second microservice comes into play. It will fetch the user information by key from the same Redis instance, and we will compare that with fetching the data directly from our database. Incidentally, Redis stands for Remote Dictionary Server and can itself be used as a database, but Redis as a database offering is a topic for another article; covering it here would only clutter this one.
In the second step, after the login logic, we will update the user information in one of the microservices and let the other microservice listen for that change instead of polling, leveraging the pub-sub paradigm provided by Redis. That is the agenda of this tutorial.
Prerequisites for the tutorial.
Understanding of Node, Express and JavaScript. Node should be installed on your system.
Understanding of MongoDB. MongoDB should be installed on your system.
Microsoft’s VS Code editor, although any editor can be used.
Table of Contents
1. Setting Up Two Microservices using Node.js.
2. Setting Up MongoDB and connecting it with the microservices.
3. Setting Up Redis and connecting it with the microservices.
4. Creating Login API in Microservice One.
5. Setting Up Redis Pub-Sub and utilising it in our microservices.
6. Redis Commander.
7. Wrap Up and References.
1. Setting Up Two Microservices using Node.js.
Let’s get started with the two Node services. To set up both services, we need to do the following:
Initialising the project with npm init.
Creating an “index.js” file and adding basic Express code.
Installing the express package, using the command npm install express.
We can set up the Node project using the following command, which will initialise it.
> npm init
You will see prompts in your terminal; fill in the details shown in the image below, such as name, version, description, main, scripts, author and license. Let’s call this service backend-service-1.
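For reference, the resulting package.json might end up looking something like this (the exact values depend on what you enter at the prompts):
{
  "name": "backend-service-1",
  "version": "1.0.0",
  "description": "Demo microservice using Redis caching and pub-sub",
  "main": "index.js",
  "scripts": {
    "start": "node index.js"
  },
  "author": "",
  "license": "ISC"
}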
After that, we will create an index.js file and add a simple Express server with the following code. This will start the Express server on port 3000, as specified in the code.
// index.js
const express = require('express')
const app = express()
const port = 3000
app.get('/', (req, res) => {
  res.send('Hello World Service 1! ')
})
app.listen(port, () => {
  console.log(`Example app listening on port ${port}`)
});
Now we can create another service similar to backend-service-1 and name it backend-service-2. Let’s repeat the same procedure for backend-service-2, i.e.:
Starting with npm init
Creating an index.js file
Installing express using the command npm install express
Adding basic Express code and running it on port 3001 by updating the port variable: const port = 3001; (see the sketch below)
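For reference, a minimal index.js for backend-service-2 would look almost identical to the one above, only with the port changed:
// index.js (backend-service-2)
const express = require('express')
const app = express()
const port = 3001
app.get('/', (req, res) => {
  res.send('Hello World Service 2! ')
})
app.listen(port, () => {
  console.log(`Example app listening on port ${port}`)
});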
After creating both services, your code structure should look something like the image below (Image 3). For context, the image shows the expanded view of the code structure for backend-service-1; you can expect a similar structure for backend-service-2.
You can run each service by opening its folder and using the following command:
> node index.js
After this, you will see output like the following in the terminal.
2. Setting Up MongoDB and connecting it with the microservices.
After setting up the microservices, we need to connect MongoDB to our application. To do that, we first need to install and run MongoDB on our system/server.
For installing and running MongoDB, you can follow this tutorial: Install MongoDB on OSX.
In production, MongoDB is generally set up as a cluster across multiple servers, which makes it more fault-tolerant.
For this tutorial, however, a single instance will work perfectly well. The linked tutorial is for OSX; if you are using another OS such as Windows or Ubuntu, you can search for the corresponding instructions and continue with the installation.
To check whether Mongo is installed, run the following command. You will see something like the output below.
> mongo
Once the installation is done, we can connect Mongo with our Node.js microservice using the mongodb npm package.
Install the npm package using the following command.
> npm i mongodb
After installing the package, we need to add the code for the Mongo connection. Create a file named mongo.js where we will add it.
We also need to add "type": "module" to the package.json file at the root of the project, which lets us use ES6 import/export syntax.
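For example, after the change the top of package.json would include the new field:
{
  "name": "backend-service-1",
  "version": "1.0.0",
  "type": "module",
  "main": "index.js"
}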
Following is the code for connecting MongoDB with your app. Here you can see we connect to MongoDB running on localhost:27017.
MongoDB Default Port: 27017
connectToCluster is a global function that creates a connection to the DB. Using that connection, we will make the subsequent DB queries.
// mongo.js
import { MongoClient } from 'mongodb';

export async function connectToCluster() {
  const uri = 'mongodb://localhost:27017';
  let mongoClient;
  try {
    mongoClient = new MongoClient(uri);
    console.log('Connecting to MongoDB');
    await mongoClient.connect();
    console.log('Successfully connected to MongoDB!');
    return mongoClient;
  } catch (error) {
    console.error('Connection to MongoDB failed!', error);
    process.exit();
  }
}
We can do a similar implementation for backend-service-2 and set up MongoDB there as well.
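The login API in a later section looks up a document in a collection named user inside a database named test, so it helps to insert a test user first. A rough one-off script like the one below will do; the file name seed.js and the sample credentials are only illustrative:
// seed.js -- one-off script to insert a test user (illustrative values)
import { connectToCluster } from './mongo.js';

async function seed() {
  const mongoClient = await connectToCluster();
  try {
    const collection = mongoClient.db('test').collection('user');
    // Plain-text password only for this tutorial, never for production.
    await collection.insertOne({ name: 'apoorv', password: 'secret123' });
    console.log('Test user inserted.');
  } finally {
    await mongoClient.close();
  }
}

seed();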
3. Setting Up Redis and connecting it with the microservices.
After setting up the Node microservices and connecting them with MongoDB, we are ready to hop to our next step, i.e. installing Redis and connecting it with our microservices.
On macOS we can use brew to install and run the Redis service for our project.
brew install redis
Redis Default Port: 6379
Once the installation is complete, you can start the Redis server with either of the following commands.
redis-server
Or you can start it as a brew service, which will keep it running in the background.
brew services start redis
So far we have set up Redis on our local system. A similar procedure can be followed to set up Redis in the cloud (AWS, Google Cloud, Azure, etc.).
Now let’s connect Redis to our services. The next step is to install the redis npm package in both services using the following command.
npm i redis
After installing the package, we create a Redis client instance that we will use from now on.
// redis.js
import redis from 'redis';

export const client = redis.createClient();

client.on('connect', () => {
  console.log('Connected to Redis Successfully.')
});
Here we are exporting a Redis client instance, which we will use to connect and run further queries.
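By default, createClient() connects to localhost:6379. If your Redis server runs somewhere else (for example a cloud instance), you can pass a connection URL instead; the host and password below are placeholders:
// redis.js -- connecting to a remote Redis instance (placeholder values)
import redis from 'redis';

export const client = redis.createClient({
  url: 'redis://:your-password@your-redis-host:6379'
});

client.on('error', (err) => {
  console.error('Redis client error', err);
});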
Now we are done with the basic setup. In the next step, we will connect to the Redis server and, once the connection is in place, use caching in our GET request.
4. Creating Login API in Microservice One.
Once Redis and MongoDB are set up and connected to our application, it is time to use Redis caching with a simple login API.
Here we will create a simple login API to which we pass the username and password.
Passing passwords in plain text is not recommended; we are only doing it here for tutorial purposes.
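If you want to avoid plain-text passwords even in a demo, one common approach (not used in the rest of this tutorial) is to store a salted hash instead, for example with Node’s built-in crypto module. A minimal sketch:
// hash.js -- sketch of salted password hashing with Node's crypto module
import { scryptSync, randomBytes, timingSafeEqual } from 'crypto';

export function hashPassword(password) {
  const salt = randomBytes(16).toString('hex');
  const hash = scryptSync(password, salt, 64).toString('hex');
  return `${salt}:${hash}`;
}

export function verifyPassword(password, stored) {
  const [salt, hash] = stored.split(':');
  const candidate = scryptSync(password, salt, 64).toString('hex');
  return timingSafeEqual(Buffer.from(hash, 'hex'), Buffer.from(candidate, 'hex'));
}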
Let’s add the code for the login API and then walk through it. Following is the updated index.js file. It contains a login API that falls back to our DB whenever the data cannot be found in the Redis in-memory store.
// index.js
import express from 'express';
import { connectToCluster } from './mongo.js';
import { client } from './redis.js';

const app = express()
const port = 3000
let mongoClient = null;

app.get('/', async (req, res) => {
  res.send('Hello World Service 1! ')
});

app.get('/login/:name/:password', async (req, res) => {
  try {
    const { name, password } = req.params;
    // Check if the password exists in Redis.
    const tempPassword = await client.get(name);
    if (tempPassword) {
      if (tempPassword === password) {
        res.send('Login Success from Redis.');
      } else {
        res.send('Login Failed.');
      }
    }
    else {
      // MongoDB query
      mongoClient = await connectToCluster();
      const dbName = 'test';
      const db = mongoClient.db(dbName);
      const collection = db.collection('user');
      const findResult = await collection.findOne({ name, password })
      if (findResult) {
        await client.set(name, password);
        res.send('Login Success from DB.');
      } else {
        res.send('Login Failed.');
      }
    }
  } catch (err) {
    console.log(err)
    res.send('Login Failed.');
  }
  finally {
    // Only close the Mongo connection if we actually opened one.
    if (mongoClient) {
      await mongoClient.close();
    }
  }
});

app.listen(port, async () => {
  await client.connect();
  console.log(`Example app listening on port ${port}`)
});
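With MongoDB and Redis running, you can try the endpoint from a terminal. Using the sample credentials from the seed step (adjust them to whatever user exists in your DB), the first request is served from MongoDB and the second from Redis:
> curl http://localhost:3000/login/apoorv/secret123
Login Success from DB.
> curl http://localhost:3000/login/apoorv/secret123
Login Success from Redis.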
Explanation of the above code:
app.get('/login/:name/:password'
This is the API endpoint we have defined; it expects the user to pass the name and password as parameters.
const tempPassword = await client.get(name);
if (tempPassword) {
  if (tempPassword === password) {
    res.send('Login Success from Redis.');
  } else {
    res.send('Login Failed.');
  }
}
Here we are checking whether Redis already has an entry with the user’s name as the key.
Redis can store various data types, ranging from plain strings to hash maps. Here we store the data as a simple key-value pair, where the key is the user’s name and the value is the password.
Again, to emphasise: in production we do not store passwords in plain text; this is only to help us understand Redis.
Moving forward, if we find the data in Redis, we simply compare the passwords and respond accordingly. If there is no key matching the name passed, we move on to our DB call.
else {
  // MongoDB query
  mongoClient = await connectToCluster();
  const dbName = 'test';
  const db = mongoClient.db(dbName);
  const collection = db.collection('user');
  const findResult = await collection.findOne({ name, password })
  if (findResult) {
    await client.set(name, password);
    res.send('Login Success from DB.');
  } else {
    res.send('Login Failed.');
  }
}
This is the else branch for the case where we could not find the name in the Redis store. Inside it, once we find a successful login in the DB, we add the username and password to our Redis in-memory cache using the following code.
if (findResult) {
  await client.set(name, password);
  res.send('Login Success from DB.');
} else {
  res.send('Login Failed.');
}
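One variation worth considering: node-redis lets you attach an expiry to the key when setting it, so cached entries do not live forever. For example, a one-hour TTL (the 3600 seconds value is just an example):
// Cache the entry with a one-hour expiry instead of keeping it indefinitely.
await client.set(name, password, { EX: 3600 });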
In a real application there would be many requests returning the same data, and caching is a natural fit for that pattern.
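Since backend-service-2 connects to the same Redis instance, it can serve those repeated reads straight from the cache, as described in the introduction. A minimal sketch of such a route in backend-service-2 (the route path and response texts are only illustrative):
// index.js (backend-service-2) -- sketch of reading the cached entry
app.get('/user/:name', async (req, res) => {
  const { name } = req.params;
  const cached = await client.get(name);
  if (cached) {
    res.send(`Found ${name} in the Redis cache.`);
  } else {
    res.send(`${name} is not cached yet.`);
  }
});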
5. Setting Up Redis Pub-Sub and utilising it in our microservices.
To use the pub-sub paradigm, we create an API to update user information in backend-service-1, and the other service, backend-service-2, listens for the changes by subscribing to a channel.
We will build the API shown below. It expects a username; once we receive it, we publish it to the other services on a channel named ‘user’. In other words, the message is broadcast from backend-service-1, and anyone who wants to react to the change can simply subscribe to the channel and use the broadcast message.
app.get('/updateName/:name', async (req, res) => {
  const { name } = req.params;
  await client.publish('user', name);
  res.send(`Name: ${name} has been published`)
});
In backend-service-2, we subscribe to the channel named ‘user’. As explained earlier, as soon as someone hits the update-name API on backend-service-1, the message is published to the user channel. Since backend-service-2 is subscribed to the same channel, it receives the transmitted message, i.e. the name to be updated.
Now service 2 has the freedom to play around with the data.
app.listen(port, async () => {
  await client.connect();
  // A Redis client in subscriber mode cannot run regular commands,
  // so we use a dedicated (duplicated) connection for the subscription.
  const subscriber = client.duplicate();
  await subscriber.connect();
  await subscriber.subscribe('user', (message) => {
    console.log(message); // the name published by backend-service-1
  });
  console.log(`Example app listening on port ${port}`)
});
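To see the whole flow, run both services, then hit the update endpoint on backend-service-1; the published name should show up in backend-service-2’s console:
> curl http://localhost:3000/updateName/apoorv
Name: apoorv has been published
In the terminal of backend-service-2, you should then see apoorv logged by the subscriber.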
Similarly, many other use cases can be solved using the pub-sub paradigm.
6. Redis Commander.
As a bonus tip, there is a tool named Redis Commander. Redis Commander is a user interface that gives you a visualisation of the data in your Redis instance.
To install the tool you can use the following command.
> npm i -g redis-commander
// After installation, run Redis Commander with this command.
> redis-commander
A user interface like the one below will be rendered. This makes searching and analysing the data very easy.
Reference: https://www.npmjs.com/package/redis-commander
7. Wrap Up and References.
In this tutorial, we have explored the Redis caching mechanism and the pub-sub paradigm. These Redis features can be used for various purposes and can help your application scale well.
Another benefit is that the data lives in memory and is readily available, so latency is much lower than with a typical disk-based database.
Redis provides various additional features, such as using Redis as a full-fledged database, data persistence, etc., but we will cover those topics in upcoming tutorials.
Meanwhile, you can read more about Redis on the official website (Redis).
Other References.
Watch this video on the benefits of Redis Cloud over other Redis providers
Redis Developer Hub — tools, guides, and tutorials about Redis
This post is in collaboration with Redis.
About The Author
Apoorv Tomar is a software developer and part of Mindroast. You can connect with him on Twitter, LinkedIn, Telegram and Instagram. Subscribe to the newsletter for the latest curated content. Don’t hesitate to say ‘Hi’ on any platform, and mention where you found my profile.