What is *Asynchronous Processing*?
Web applications undoubtedly have a great deal of code that executes as part of the HTTP request/response cycle. This is suitable for fast tasks that can be done within hundreds of milliseconds or less.
However, any processing that would take more than a second or two is ultimately far too slow for synchronous execution. In addition, there is often processing that needs to be scheduled in the future and/or processing that needs to interact with an external service.
When we have a task that needs to execute but is not a candidate for synchronous processing, the best course of action is to move its execution outside the request/response cycle.
Specifically, we can have the synchronous web app simply notify another separate program that certain processing needs to be done at a later time.
Now, instead of the task running as a part of the actual web response, the processing runs separately so that the web application can respond quickly to the request.
This is where task queues come in.
What are task queues/message queues?
- Task queues manage background work that must be executed outside the usual HTTP request-response cycle.
- Tasks are handled asynchronously either because they are not initiated by an HTTP request or because they are long-running jobs that would dramatically reduce the performance of an HTTP response.
- At the simplest level, a task queue/message queue is a way for applications and discrete components to send messages to one another in order to reliably communicate.
- Message queues are typically (but not always) ‘brokers’ that facilitate message passing by providing a protocol or interface that other services can access.
- This interface connects **producers**, which create messages, with **consumers**, which then process them.
Within the context of a web application, one common case is that the producer is a client application (e.g. Rails or Sinatra) that creates messages based on interactions from the user (e.g. a user signing up).
The consumers in that case are typically daemon processes (e.g. rake tasks) that process the arriving messages.
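To make the producer/consumer idea concrete, here is a toy sketch using Python's built-in `queue` module, with no broker involved; the task name and user id are made up for illustration. A real message queue replaces `queue.Queue` with a broker such as RabbitMQ or Redis.

```python
import queue
import threading

messages = queue.Queue()

def producer():
    # e.g. a web request handler creating a message when a user signs up
    messages.put({"task": "send_welcome_email", "user_id": 42})

def consumer():
    # e.g. a daemon process that waits for messages and processes them
    msg = messages.get()
    print(f"processing {msg['task']} for user {msg['user_id']}")
    messages.task_done()

threading.Thread(target=consumer, daemon=True).start()
producer()
messages.join()  # block until the consumer has processed the message
```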
Asynchronous task queues are tools that allow pieces of a software program to run in a separate process or on a separate machine.
They are often used in web architectures as a way to delegate long-running tasks while quickly answering requests.
The delegated task can trigger an action, such as sending an email to the user, or simply update data internally in the system when it finishes executing.
Task queues are used as a mechanism to distribute work across threads or machines.
A task queue’s input is a unit of work called a task.
Dedicated worker processes constantly monitor task queues for new work to perform.
Celery communicates via messages, usually using a broker to mediate between clients and workers.
To initiate a task, the client adds a message to the queue; the broker then delivers that message to a worker.
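As a minimal sketch of that flow (the module name, broker URL and task body are assumptions, not part of any particular project):

```python
# tasks.py
from celery import Celery

# The broker (a local Redis instance here) carries messages between
# the client that requests work and the workers that perform it.
app = Celery("tasks", broker="redis://localhost:6379/0")

@app.task
def add(x, y):
    return x + y
```

Calling `add.delay(2, 3)` from the client puts a message on the queue; a worker started with `celery -A tasks worker --loglevel=info` receives it from the broker and runs the function in the background.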
Need for Celery?
Offloading work from your app to distributed processes that can run independently of your app.
Scheduling task execution at a specific time, sometimes as recurring events.
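Both points can be sketched roughly as follows; the app, task and schedule are hypothetical and assume a local Redis broker is running:

```python
from datetime import datetime, timedelta, timezone

from celery import Celery
from celery.schedules import crontab

app = Celery("scheduler_demo", broker="redis://localhost:6379/0")

@app.task
def send_report():
    print("report sent")

# Offloading / scheduling one-off work:
send_report.delay()                    # run as soon as a worker is free
send_report.apply_async(countdown=60)  # run 60 seconds from now
send_report.apply_async(
    eta=datetime.now(timezone.utc) + timedelta(hours=1)  # run at a specific time
)

# Recurring execution, driven by the separate `celery beat` scheduler process:
app.conf.beat_schedule = {
    "nightly-report": {
        "task": send_report.name,
        "schedule": crontab(hour=0, minute=0),  # every day at midnight
    },
}
```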
What is Celery?
By definition, Celery is a powerful, production-ready asynchronous job queue that allows you to run time-consuming Python functions in the background. A Celery-powered application can respond to user requests quickly, while long-running tasks are passed on to the queue. In this article we will demonstrate how to add Celery to a Django application using Redis.
Celery uses a *broker* to pass messages between your application and Celery worker processes.
Celery aims to provide a quick interface for sending messages to its distributed task queue.
Celery-Django Workflow:
Celery reduces the load on the web application by running parts of its functionality as postponed tasks, either on the same server as the application or on a different server.
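The usual way to wire Celery into a Django project looks roughly like this; the project name `proj` is a placeholder, and the Redis URLs assume a broker running locally:

```python
# proj/celery.py
import os

from celery import Celery

os.environ.setdefault("DJANGO_SETTINGS_MODULE", "proj.settings")

app = Celery("proj")
# Pick up any setting prefixed with CELERY_ from Django's settings.py.
app.config_from_object("django.conf:settings", namespace="CELERY")
# Look for a tasks.py module inside every installed Django app.
app.autodiscover_tasks()

# proj/__init__.py
#   from .celery import app as celery_app
#   __all__ = ("celery_app",)

# proj/settings.py (excerpt)
#   CELERY_BROKER_URL = "redis://localhost:6379/0"
#   CELERY_RESULT_BACKEND = "redis://localhost:6379/0"
```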
Basic functionalities of Celery:
Define tasks as Python functions.
Listen to a message broker for new tasks.
Assign the tasks to workers.
Monitor the workers and tasks.
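A minimal sketch tying these points together; the app name `myapp` and the task body are placeholders:

```python
# myapp/tasks.py
from celery import shared_task

@shared_task
def resize_image(image_id):
    # ...load the image, resize it, save the result...
    return image_id

# Workers listen to the broker and execute tasks as they arrive; they run as
# separate processes started with, for example:
#   celery -A proj worker --loglevel=info
# Monitoring is available from the CLI (`celery -A proj status`,
# `celery -A proj inspect active`) or through a dashboard such as Flower.
```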
The architecture of Celery:
The internal workings of Celery can be described with the producer/consumer model: producers place jobs in a queue, and consumers read them from the queue. Given this, at a high level Celery has three main components:
1] **Producers:**
Producers are commonly the ‘web nodes’, i.e. the web service processes handling web requests. During request processing, tasks are delegated to Celery, i.e. pushed onto the task queue.
2] **Queue:**
The queue is the broker, which passes tasks from the web application to the Celery worker(s). Celery has full support for RabbitMQ and Redis, and also supports Amazon SQS and Zookeeper, but with limited capabilities.
3] **Consumers:**
Consumers are the ‘worker nodes’, listening at the head of the queue: whenever a task is published, they consume and execute it. Workers can also publish back to the queue, triggering other tasks, so they can behave as producers as well.
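A rough sketch of the three roles working together; the view, task names and bodies are hypothetical:

```python
# myapp/tasks.py - the consumers (worker nodes)
from celery import shared_task

@shared_task
def notify_user(image_id):
    print(f"thumbnail {image_id} is ready")

@shared_task
def generate_thumbnail(image_id):
    # ...resize the image here...
    # A worker can publish back to the queue, acting as a producer itself:
    notify_user.delay(image_id)

# myapp/views.py - the producer (web node), delegating work during a request
from django.http import JsonResponse
from myapp.tasks import generate_thumbnail

def upload_view(request):
    generate_thumbnail.delay(image_id=123)  # pushed through the broker to a worker
    return JsonResponse({"status": "processing"})
```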
Main points:
Using Celery you can assign a task to a worker and continue with your own routine.
You can move anything that takes time out of the request/response cycle.
Can be used for:
Sending emails (see the sketch after this list).
Sending push notifications.
Resizing and editing images.
Taking backups.
Data sync duties.
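For example, sending the welcome email in the background so the signup view never waits on the SMTP server; the task name and message contents are made up:

```python
# myapp/tasks.py
from celery import shared_task
from django.core.mail import send_mail

@shared_task
def send_welcome_email(address):
    send_mail(
        subject="Welcome!",
        message="Thanks for signing up.",
        from_email="noreply@example.com",
        recipient_list=[address],
    )

# Inside the signup view:
#   send_welcome_email.delay(user.email)
```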