How to implement Celery with Django

What is Celery?

Perhaps you are wondering, like I once was: what the heck is Celery? Let’s begin by understanding what Celery is. Celery is an open-source asynchronous task queue based on distributed message passing. And what is a task queue? A task queue’s input is a unit of work called a task. Task queues are used as a mechanism to distribute work across threads or machines.

Celery communicates via messages, using a broker such as RabbitMQ or Redis. The broker mediates between clients and workers: a client initiates a task by adding a message to the queue, and the broker then delivers that message to a worker.
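To make this concrete, here is a minimal, self-contained sketch of that flow, separate from the project we build below (it assumes a Redis broker running on localhost):

# demo.py: a minimal sketch of a Celery task, assuming Redis on localhost
from celery import Celery

app = Celery("demo", broker="redis://localhost:6379/0")

@app.task
def add(x, y):
    # This body runs in a worker process, not in the caller
    return x + y

Starting a worker with celery --app=demo worker and then calling add.delay(2, 3) from another Python shell queues a message in Redis; the broker delivers it to the worker, which executes the task.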

Why Celery?

We use Celery because of what it offers to end users. People don’t have to wait for a transaction to complete; instead, Celery takes care of the background tasks so users don’t notice the delays the system might be experiencing. Here are a few reasons for using Celery:

  • It’s simple. Celery doesn’t need configuration files, and is therefore easy to use and maintain.
  • It’s highly available: workers and clients automatically retry in the event of connection loss or failure.
  • It’s flexible: almost every part of Celery can be extended or used on its own, such as the serializers, logging, and broker transports.
  • It’s fast: a single Celery process can handle millions of tasks a minute with optimized settings.
  • It supports several concurrency models, such as prefork (multiprocessing), eventlet/gevent, and threads.
  • It supports multiple serializers (JSON, pickle, YAML, msgpack) and brokers such as RabbitMQ, Redis, and Amazon SQS (see the sketch after this list).
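For instance, the serializer is set through configuration, and the concurrency model is chosen when starting the worker. A small standalone sketch (the app name and values here are illustrative, not part of this tutorial's project):

# Illustrative example of pluggable serializer settings
from celery import Celery

app = Celery("demo", broker="redis://localhost:6379/0")
app.conf.update(
    task_serializer="json",   # encode task messages as JSON
    result_serializer="json",
    accept_content=["json"],  # reject messages in any other format
)

The worker's concurrency model is picked on the command line, e.g. celery --app=demo worker --pool=threads --concurrency=4.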

How to implement Celery with a Django application

To implement Celery in a Django application, we first need to install Celery and a broker, RabbitMQ or Redis. So let’s begin.

In this example, I will be using Redis both as the message broker and as the result backend.

We are going to run the application in Docker containers, because I love to do things inside a Docker container.

To start building, create a skeleton project and add django, celery, redis, gunicorn, whitenoise, pytest-django, pytest, and black to the requirements.txt file.

django==4.1
celery==5.2.7
redis==4.4.0
gunicorn==20.1.0
whitenoise==6.2.0
pytest-django==4.5.2
black==22.12.0
pytest==7.2.0

The second thing is to create a Dockerfile and a docker-compose file. Add the following content to the Dockerfile.

# Python base image. Get this from dockerhub
FROM python:3.11-slim

# Set environment variables
ENV PYTHONUNBUFFERED=1

# Set your working directory
WORKDIR /usr/src/app

# Install dependencies required by the image
RUN apt update && apt install -y g++ libpq-dev gcc musl-dev

# Allow docker to cache installed dependencies between builds 
COPY requirements.txt .
RUN python3 -m pip install -r requirements.txt --no-cache-dir

# Copy the project to the working directory
COPY . .

# Script to run given instruction eg running production server.
CMD ["./run.sh"]

Also, add these contents to the docker-compose file

#docker-compose.yml
version: '3.9'

services:
  web:
    build: .
    command: python manage.py runserver 0.0.0.0:8000
    volumes:
      - .:/usr/src/app/
    ports:
      - 1337:8000
    environment:
      - DEBUG=1
      - SECRET_KEY=mySecretkey456
      - DJANGO_ALLOWED_HOSTS=localhost 127.0.0.1 [::1]
      - CELERY_BROKER=redis://redis:6379/0
      - CELERY_BACKEND=redis://redis:6379/0
    depends_on:
      - redis

  celery:
    build: .
    command: celery --app=main worker --loglevel=info --logfile=logs/celery.log
    volumes:
      - .:/usr/src/app
    environment:
      - DEBUG=1
      - SECRET_KEY=mySecretkey456
      - DJANGO_ALLOWED_HOSTS=localhost 127.0.0.1 [::1]
      - CELERY_BROKER=redis://redis:6379/0
      - CELERY_BACKEND=redis://redis:6379/0
    depends_on:
      - web
      - redis

  redis:
    image: redis:7-alpine

The third thing to do is to create a Django project; let’s call our project main. Use the following command:

django-admin startproject main .

The fourth thing to do is to create a script that collects static files and runs the Gunicorn server. Create a run.sh file, place the following code in it, and make it executable (chmod +x run.sh) so the Dockerfile’s CMD can execute it.

#!/bin/bash

python manage.py collectstatic --no-input

exec gunicorn --bind 0.0.0.0:8000 main.wsgi:application -w 2

The fifth thing to do is to create a Django app called review:

docker-compose run web /usr/local/bin/django-admin startapp review

You can add tests to the project; I am going to use pytest. Create a tests folder inside the app folder (pytest and pytest-django are already in our requirements). Then create a pytest.ini file at the project root and add the following content.

[pytest]
DJANGO_SETTINGS_MODULE = main.settings

python_files = tests.py test_*.py *_tests.py
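To confirm the test setup works, you can drop a placeholder test into the new folder. A minimal sketch (the file name and assertion are just illustrative):

# review/tests/test_sanity.py: illustrative placeholder test
from django.conf import settings


def test_settings_loaded():
    # pytest-django loads main.settings as declared in pytest.ini
    assert settings.DEBUG in (True, False)

Once the containers are up, docker-compose run web pytest should collect and pass it.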

The last thing here is to power up our containers. First create the logs directory that the Celery service writes its log file to (mkdir logs), then run:

docker-compose up --build

This will pull the Redis image and build our Python 3.11 image. (The celery service won’t start cleanly yet; it needs the Celery configuration we add in the next section.) There is so much going on here; to learn more about Docker, check this Docker 101 tutorial.

Adding and configuring Celery inside Django

With the Django project set up, we can move ahead and add Celery-related settings to the Django project. This creates the connection between the Django app and the task queue.

So, inside our Django project folder (main), create a file and name it celery.py:

import os

from celery import Celery


os.environ.setdefault("DJANGO_SETTINGS_MODULE", "main.settings")
app = Celery("main")
app.config_from_object("django.conf:settings", namespace="CELERY") 
app.autodiscover_tasks()

  • The first thing we did here is to let Celery know how to find the Django project, via os.environ.setdefault("DJANGO_SETTINGS_MODULE", "main.settings").
  • The second thing is to create a Celery application instance named after the project, main.
  • The third thing is to prevent name clashes with other Django settings by using namespace="CELERY": Celery will only read settings whose names start with CELERY_.
  • The last thing is to tell Celery to automatically discover tasks defined in each installed app’s tasks.py.

For celery.py to start fetching the necessary Celery settings, we need to tell Celery about Redis as the broker and as the result backend. Add the following code to the bottom of settings.py (and make sure import os appears at the top; recent Django project templates don’t include it by default).

CELERY_BROKER_URL = os.environ.get("CELERY_BROKER", "redis://redis:6379/0")
CELERY_RESULT_BACKEND = os.environ.get("CELERY_BACKEND", "redis://redis:6379/0")

The next thing is to update __init__.py in our main Django project. Add the following code:

from .celery import app as celery_app

__all__ = ('celery_app',)

This code makes sure the Celery app is always imported when Django starts, so that the @shared_task decorator will use it.
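Writing tasks is the subject of the next post, but as a preview of what autodiscover_tasks() picks up, a task module in the review app could look like this (a hypothetical sketch, not code the project needs yet):

# review/tasks.py: hypothetical example of a discoverable task
from celery import shared_task


@shared_task
def count_reviews():
    # Placeholder work; a real task might query the database or call an API
    return 42

Calling count_reviews.delay() from a view would enqueue the task in Redis and return immediately.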

Now you can power up your containers again, and you will see that Celery is up and running, with a few warnings that I will optimize away later.
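To double-check that the worker is really alive, you can ping it from another terminal (using the service names from the docker-compose file above):

docker-compose exec celery celery --app=main inspect ping

A healthy worker replies with pong.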

[Image: Nextgentips, Celery status]

That is it for the Celery configuration. In the next post I will show you how to handle workloads asynchronously and start writing tests for our project. For now, feel free to send me any issues related to Celery, Redis, or RabbitMQ. Get the code here
