Sync and Async Programming
Modern software development is full of complexity, and smart architectures have evolved to deal with it. For example, if we build an application and later decide it must handle thousands or millions of users, we have to make smart decisions, such as taking a microservices approach.
On the other hand, some services can slow our system down.
Take the example of buying a flight ticket online. The booking succeeds only after the payment goes through; this is synchronous programming. The booking, however, should not depend on whether the confirmation email is delivered.
So, the email and the booking are not directly related: completing the booking should not wait for the email to be sent. We can therefore run the email in the background, i.e. asynchronously.
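As a rough sketch of the difference (`bookFlight` and `sendConfirmationEmail` are hypothetical stand-ins, not real APIs):

```javascript
// Hypothetical sketch: the payment/booking step is synchronous,
// while the confirmation email runs in the background.
function bookFlight(user) {
  // The booking only exists once payment succeeds (synchronous step).
  return { user, bookingId: 1, status: 'CONFIRMED' };
}

function sendConfirmationEmail(booking) {
  // Fire-and-forget: the booking never waits for this promise.
  return new Promise((resolve) =>
    setTimeout(() => resolve(`email sent for booking ${booking.bookingId}`), 50)
  );
}

const booking = bookFlight('alice');
sendConfirmationEmail(booking).then((msg) => console.log(msg));

// This line runs before the email is delivered:
console.log(`booking ${booking.bookingId} is ${booking.status}`);
```

The booking status is printed immediately; the email resolves later without ever blocking the booking.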
Refer to the image below for better clarity:
- Sync programming
- Async programming using message queues

Here we can see that a message queue has been used: producers submit events, and different consumers consume them. Note: this is just a simple example.
Message Queues
Message queues are used for service-to-service communication and give us asynchronous behaviour. They are mainly used in serverless and microservices architectures.
There are various MQs out there on the market:
- Apache Kafka
- Azure Scheduler
- Nastel
- Apache Qpid
- RabbitMQ
Today we will look at Kafka with Node.js so that you can easily use it in your projects.
**You can refer to this GitHub repo for reference.**
Implementation
Here is the setup: Kafka runs inside a Docker container, the producer adds events to a Kafka topic (the queue), and the consumer consumes them.
- Docker Compose File
```yaml
version: "3"
services:
  zookeeper:
    image: 'bitnami/zookeeper:latest'
    ports:
      - '2181:2181'
    environment:
      - ALLOW_ANONYMOUS_LOGIN=yes
  kafka:
    image: 'bitnami/kafka:latest'
    container_name: 'kafka'
    ports:
      - '9092:9092'
    environment:
      - KAFKA_BROKER_ID=1
      - KAFKA_LISTENERS=PLAINTEXT://:9092
      - KAFKA_ADVERTISED_LISTENERS=PLAINTEXT://127.0.0.1:9092
      - KAFKA_ZOOKEEPER_CONNECT=zookeeper:2181
      - ALLOW_PLAINTEXT_LISTENER=yes
    depends_on:
      - zookeeper
```
Here we run the Kafka image inside a Docker container and expose port 9092 to the outside world. ZooKeeper is a dependency that Kafka needs in order to run.
- eventType.js file
```javascript
import avro from 'avsc';

export default avro.Type.forSchema({
  type: 'record',
  fields: [
    {
      name: 'category',
      type: { type: 'enum', symbols: ['DOG', 'CAT'] }
    },
    {
      name: 'noise',
      type: 'string',
    }
  ]
});
```
This is the schema we define for the events that will be submitted to the Kafka topic.
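To see what the schema buys you without pulling in `avsc`, here is a hypothetical stand-in that mimics the `toBuffer`/`fromBuffer` round trip using plain JSON plus manual validation. (Avro additionally gives you a compact binary encoding and schema evolution, which this sketch does not.)

```javascript
// Hypothetical stand-in for the avsc type above: validate the event shape,
// then serialize/deserialize it. Plain JSON instead of Avro's binary format.
const SYMBOLS = ['DOG', 'CAT'];

function toBuffer(event) {
  if (!SYMBOLS.includes(event.category)) throw new Error('invalid category');
  if (typeof event.noise !== 'string') throw new Error('invalid noise');
  return Buffer.from(JSON.stringify(event));
}

function fromBuffer(buf) {
  return JSON.parse(buf.toString());
}

const buf = toBuffer({ category: 'CAT', noise: 'meow' });
console.log(fromBuffer(buf)); // { category: 'CAT', noise: 'meow' }
```

An event that violates the schema (say, `category: 'FISH'`) is rejected at serialization time, which is exactly the guarantee the Avro type gives the producer.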
- Producer index.js
```javascript
import Kafka from 'node-rdkafka';
import eventType from '../eventType.js';

const stream = Kafka.Producer.createWriteStream({
  'metadata.broker.list': 'localhost:9092'
}, {}, {
  topic: 'test'
});

stream.on('error', (err) => {
  console.error('Error in our kafka stream');
  console.error(err);
});

function queueRandomMessage() {
  const category = getRandomAnimal();
  const noise = getRandomNoise(category);
  const event = { category, noise };
  const success = stream.write(eventType.toBuffer(event));
  if (success) {
    console.log(`message queued (${JSON.stringify(event)})`);
  } else {
    console.log('Too many messages in the queue already..');
  }
}

function getRandomAnimal() {
  const categories = ['CAT', 'DOG'];
  return categories[Math.floor(Math.random() * categories.length)];
}

function getRandomNoise(animal) {
  if (animal === 'CAT') {
    const noises = ['meow', 'purr'];
    return noises[Math.floor(Math.random() * noises.length)];
  } else if (animal === 'DOG') {
    const noises = ['bark', 'woof'];
    return noises[Math.floor(Math.random() * noises.length)];
  } else {
    return 'silence..';
  }
}

setInterval(() => {
  queueRandomMessage();
}, 3000);
```
Here we produce a random message every 3000 ms. This is asynchronous because setInterval() is provided by Node's timer API and schedules its callback asynchronously rather than blocking. Note the check on stream.write(): it returns false when the producer's internal buffer is full and the message cannot be queued right away.
- Consumer index.js
```javascript
import Kafka from 'node-rdkafka';
import eventType from '../eventType.js';

const consumer = new Kafka.KafkaConsumer({
  'group.id': 'kafka',
  'metadata.broker.list': 'localhost:9092',
}, {});

consumer.connect();

consumer.on('ready', () => {
  console.log('consumer ready..');
  consumer.subscribe(['test']);
  consumer.consume();
}).on('data', (data) => {
  console.log(`received message: ${eventType.fromBuffer(data.value)}`);
});
```
The consumer subscribes to the test topic, consumes its messages, and prints each one.
NOTE: To run the producer and the consumer:
- Run `npm run start:producer`
- Run `npm run start:consumer`
That's it for today. Save this post and leave a comment if you found it valuable. You can build your own projects using this simple example as a reference.