Using Redis as a Celery back end

Your Django application's performance will suffer if time-intensive tasks run as part of its request workflow. With asynchronous code and Celery (the task queue, not the vegetable), you can offload those time-intensive tasks from your Python app. This way, your app can continue to respond quickly to users while Celery handles the functions in the background.

However, you need a message broker and a database back end to keep track of the tasks and their results. These allow all three components of your asynchronous workflow to talk to one another seamlessly.

Using Celery For Asynchronous Tasks In Django

Celery is a distributed task queue, primarily designed for UNIX-like systems. It adds worker processes to your Django project, so you can offload tasks from the main request/response cycle. Using Celery can improve your Django app's performance and help it scale, and you can also use it for repeatable scheduled tasks.

A task queue is a mechanism for distributing work across threads or machines. The input to the queue is a unit of work called a task. Dedicated worker processes monitor task queues for new work to pick up.

Execution in Celery is driven by the information in messages produced by the client (your Django app). Celery therefore needs a broker to mediate between clients and workers. To initiate an execution, the client adds a message to the queue, and the broker delivers it to a worker.

Why Redis As A Key-Value Store?

We have established that a Celery system needs a broker and a database back end to communicate with the client. What better back end than an in-memory data store, one that can persist to disk to survive system shutdowns?

Redis is an open-source data store used by millions of developers as a database, streaming engine, cache, or message broker. It provides an advanced key-value store whose values can be strings, lists, sets, sorted sets, or hashes.

Redis can serve as both your message broker and your database back end at the same time. Besides holding the messages your application produces to describe the work in the Celery task queue, Redis also stores the results of Celery tasks, which the client can later retrieve and consume.
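Wiring this up is a few lines of configuration. A sketch, assuming Redis runs locally on the default port (the database numbers are an illustrative choice, not a requirement):

```python
# settings.py — point Celery at Redis for both roles.
# Database 0 holds queued task messages (the broker);
# database 1 holds task results (the back end).
CELERY_BROKER_URL = "redis://localhost:6379/0"
CELERY_RESULT_BACKEND = "redis://localhost:6379/1"
```

With a result backend configured, the client can call `result = my_task.delay(...)` and later read the stored outcome with `result.get()`.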

Where Do You Use Django And Celery With Redis?

These three tech stacks work together to make asynchronous magic in Django. A few significant use cases include:

  • Sending emails and/or email notifications
  • Generating any reports that take 3+ seconds to create
  • Running specific functions on a schedule
  • Running machine learning inference or training
  • Offloading any long-running tasks
  • Backing up a database
  • Powering up/down additional virtual machines for load handling
  • Triggering workflows
  • Sending webhook notifications

Conclusion

Asynchronously handling expensive computations and time-intensive tasks in the background can improve your Django app's performance. Using Redis not just as the message broker but also as the database back end that stores task results is a good option.