# Celery and Redis

Let's create a new branch called `feature-celery` where we will add Celery to our application. Adding Celery and Redis will allow us to process tasks asynchronously.

## Adding Celery and Redis

To add Celery to our project, we will need to do the following:

- Add `celery` and `redis` services to our docker-compose files
- Add `celery` and `redis` to our `requirements.txt`
- Add `celery` settings in `settings.py`
- Add `celery_app.py` to our Django application
- Test `celery` and `redis` with a sample task

### Docker Compose

Add the following to both `docker-compose.yml` and `docker-compose.dev.yml`:

```yml
  redis:
    image: redis:alpine
    container_name: redis
    networks:
      - main

  celery:
    build: ./backend
    container_name: celery
    command: bash -c 'celery worker --app=backend.celery_app:app --loglevel=info'
    volumes:
      - ./backend:/code
    depends_on:
      - db
      - redis
    networks:
      - main
```

Now add the following to `requirements.txt`:

```
celery==4.2
redis==2.10.5
```

Add the following to our Django settings (`settings.py`):

```python
# Celery Configuration

CELERY_BROKER_URL = 'redis://redis:6379'
CELERY_RESULT_BACKEND = 'redis://redis:6379'
CELERY_ACCEPT_CONTENT = ['application/json']
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
```

### Define our Celery App

Now add `celery_app.py` next to `settings.py` in Django:

```python
import os
import time

from celery import Celery

# Set the default Django settings module for the 'celery' program
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'backend.settings')

app = Celery('backend')

# Read any Django settings prefixed with CELERY_ (see settings.py above)
app.config_from_object('django.conf:settings', namespace='CELERY')

# Look for a tasks.py module in each installed Django app
app.autodiscover_tasks()


@app.task(bind=True)
def debug_task(self):
    print("Doing async task")
    time.sleep(2)
    print("Task is done")
```

Additionally, we need to add two lines of code to `backend/__init__.py` that will allow us to register Celery tasks in all of our Django apps:

**backend/backend/__init__.py**

```python
# This will make sure the app is always imported when
# Django starts so that shared_task will use this app.
from .celery_app import app as celery_app

__all__ = ('celery_app',)
```
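
With this in place, any of our Django apps can define tasks with the `@shared_task` decorator in a `tasks.py` module, and `app.autodiscover_tasks()` will register them. As a rough illustration only (this file is not one of the steps in this chapter, and the `Post` model is an assumption about our `posts` app):

```python
# backend/posts/tasks.py (hypothetical example, not required for this chapter)
from celery import shared_task


@shared_task
def count_posts():
    # Import inside the task so the module can load before the app registry is ready
    from .models import Post
    return Post.objects.count()
```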

### Sample Task

Let's test that this sample task is processed in our Celery worker. In the `posts` app, let's add a function and map it to a URL pattern. We will call the task inside the function body:

**backend/posts/views.py**

```python
from backend.celery_app import debug_task

from rest_framework import generics
from rest_framework.decorators import api_view, authentication_classes, permission_classes
from rest_framework.response import Response

...

@api_view()
@authentication_classes([])
@permission_classes([])
def celery_test_view(request):
    debug_task.delay()
    return Response({"message": "Your task is being processed!"})
```
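
Note that `delay()` is just a shortcut for `apply_async()` with default options. If we ever need more control over how the task runs, `apply_async` accepts extra arguments; a minimal sketch (the 10-second countdown here is arbitrary):

```python
# Equivalent to debug_task.delay(), but the worker waits 10 seconds
# before it is allowed to execute the task
debug_task.apply_async(countdown=10)
```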

**backend/posts/urls.py**

```python
from django.urls import path

from . import views

urlpatterns = [
    ...
    path('celery-test/', views.celery_test_view, name='celery-test')
]
```

Now let's test this sample task. Run:

```
docker-compose -f docker-compose.dev.yml up --build
```

Now navigate to `/api/posts/celery-test/`. You should see the JSON response returned right away, and two seconds later you should see the `"Task is done"` message printed in the `celery` service logs. Also verify that Celery tasks are working in the production environment:

```
docker-compose up --build
```
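
In either environment we can also hit the endpoint from the command line. A quick check, assuming NGINX is published on port 80 of the local machine (adjust the host and port to match your setup):

```
$ curl http://localhost/api/posts/celery-test/
{"message": "Your task is being processed!"}
```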

### Auto-refresh Celery

Let's make one more optimization for our development environment as it relates to Celery. If you have worked with Celery and Django before, you know that changes to task code are not picked up until the Celery worker is restarted. We can add a Django management command that restarts Celery when changes to our `backend` codebase are saved.

Django management commands should be put in Django apps. Let's make a new Django app called `core`, following the same steps we took while creating our `posts` app (see the sketch below). `core` will serve as an app to put things that are not directly related to any other app logic.
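
As a rough sketch of what this looks like (the exact way you invoke `manage.py` depends on your setup), the resulting layout should include `__init__.py` files in both the `management` and `commands` directories, since Django requires them in order to discover the command:

```
python3 manage.py startapp core   # run this however you normally run manage.py

backend/core/
├── __init__.py
├── apps.py
├── ...
└── management/
    ├── __init__.py
    └── commands/
        ├── __init__.py
        └── watch_celery.py
```

Remember to add `core` to `INSTALLED_APPS`; Django only discovers management commands in installed apps.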

Next, let's add a file called `watch_celery.py`:

*`backend/core/management/commands/watch_celery.py`*:

```python
"""
This command allows for celery to be reloaded when project
code is saved. This command is called in
`docker-compose.dev.yml` and is only for use in development.

https://avilpage.com/2017/05/how-to-auto-reload-celery-workers-in-development.html
"""

import shlex
import subprocess

from django.core.management.base import BaseCommand
from django.utils import autoreload


def restart_celery():
    # Kill any running worker, then start a fresh one
    cmd = 'pkill -9 celery'
    subprocess.call(shlex.split(cmd))
    cmd = 'celery worker --app=backend.celery_app:app --loglevel=info'
    subprocess.call(shlex.split(cmd))


class Command(BaseCommand):

    def handle(self, *args, **options):
        print('Starting celery worker with autoreload...')
        # Note: autoreload.main() was removed in Django 2.2; on newer versions
        # use autoreload.run_with_reloader(restart_celery) instead
        autoreload.main(restart_celery)
```

Now let's change the `command` part of the `celery` service in `docker-compose.dev.yml`:

```yml
    command: bash -c 'python3 manage.py watch_celery'
```

We can verify that this works by changing the text returned by our `celery_test_view` function; we can also see that the `celery` service is restarted when we save changes to our `backend` code.

We will not change the `celery` `command` in `docker-compose.yml`, because we won't be editing code in our production app.

### Flower

Let's add one more container that will help us monitor Celery tasks: `flower`.

Add the following to both `docker-compose.yml` and `docker-compose.dev.yml`:

```yml
  flower:
    image: mher/flower
    container_name: flower
    command: flower --url_prefix=flower
    environment:
      - CELERY_BROKER_URL=redis://redis:6379
      - FLOWER_PORT=5555
    ports:
      - 5555:5555
    networks:
      - main
    depends_on:
      - celery
      - redis
```

And then add the following to the `dev.conf` and `prod.conf` NGINX configuration files:

```
    upstream flower {
        server flower:5555;
    }

...

    # flower
    location /flower/ {
        rewrite ^/flower/(.*)$ /$1 break;
        proxy_pass http://flower/;
        proxy_set_header Host $host;
        proxy_redirect off;
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
    }
```

Once we have verified that tasks are also working in our production environment, and that we can view the Flower dashboard when we visit `/flower` in the browser, let's commit our changes and make a new release for our Celery feature.

```
git add .
git commit -m "added celery and redis"
git checkout develop
git merge feature-celery
git checkout -b release-0.0.5
git checkout master
git merge release-0.0.5
git tag -a 0.0.5
git push --all
git push --tags
```