I’m ripping my hair out with this one.
The crux of my issue is that setting CELERY_DEFAULT_QUEUE
in my Django settings.py
is not forcing my tasks onto the particular queue I’ve set up. They always go to the default celery
queue in my broker.
However, if I specify queue='proj:dev'
in the shared_task
decorator, the task goes to the correct queue. It behaves as expected.
My setup is as follows:
- Django code on my localhost (for testing and such), executing task.delay()
calls via Django’s shell (manage.py shell
)
- a remote Redis instance configured as my broker
- 2 Celery workers set up on a remote machine and waiting for messages from Redis (on Google App Engine – perhaps irrelevant)
NB: For the pieces of code below, I’ve obscured the project name and used proj
as a placeholder.
celery.py

```python
from __future__ import absolute_import, unicode_literals

import os

from celery import Celery, shared_task

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'proj.settings')

app = Celery('proj')
app.config_from_object('django.conf:settings', namespace='CELERY', force=True)
app.autodiscover_tasks()


@shared_task
def add(x, y):
    return x + y
```
settings.py

```python
...
CELERY_RESULT_BACKEND = 'django-db'
CELERY_BROKER_URL = 'redis://:{}@{}:6379/0'.format(
    os.environ.get('REDIS_PASSWORD'),
    os.environ.get('REDIS_HOST', 'alice-redis-vm'))
CELERY_DEFAULT_QUEUE = os.environ.get('CELERY_DEFAULT_QUEUE', 'proj:dev')
```
The idea is that, for right now, I’d like to have different queues for the different environments my code runs in: dev, staging, prod. Thus, on Google App Engine, I define an environment variable whose value depends on the individual App Engine service.
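The per-environment naming scheme above can be sketched as a small helper. This is only an illustration – the helper name and the APP_ENV variable are hypothetical, not part of the original project:

```python
import os


def default_queue(project='proj', fallback='dev'):
    """Build a 'project:environment' queue name for the current environment.

    Reads the hypothetical APP_ENV variable (set per App Engine service);
    falls back to 'dev' so local shells land on the dev queue by default.
    """
    env = os.environ.get('APP_ENV', fallback)
    return '{}:{}'.format(project, env)
```

In settings.py you would then write something like `CELERY_TASK_DEFAULT_QUEUE = default_queue()`, so each deployed service gets its own queue without any per-environment settings files.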
Steps
So, with the above configuration, I fire up the shell using ./manage.py shell
and run add.delay(2, 2)
. I get an AsyncResult
back, but the Redis monitor clearly shows the message was sent to the default celery
queue:
1497566026.117419 [0 155.93.144.189:58887] "LPUSH" "celery" ...
What am I missing?
Not to throw a spanner in the works, but I feel like there was a point today at which this was actually working. But for the life of me, I can’t think what part of my brain is failing me here.
Stack versions:
- python: 3.5.2
- celery: 4.0.2
- redis: 2.10.5
- django: 1.10.4
Answer
This issue is far simpler than I thought – incorrect documentation!
The Celery documentation tells us to use CELERY_DEFAULT_QUEUE
to set the task_default_queue
configuration on the Celery object.
Ref: http://docs.celeryproject.org/en/latest/userguide/configuration.html#new-lowercase-settings
We should instead use CELERY_TASK_DEFAULT_QUEUE
. The documented name is inconsistent with the naming of all the other settings. It was raised on GitHub here – https://github.com/celery/celery/issues/3772
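The reason the wrong name fails silently comes down to how the namespace mapping works: with namespace='CELERY', Celery strips the CELERY_ prefix and lowercases the remainder to find the config key. A simplified sketch of that mapping (not Celery’s actual implementation):

```python
def to_celery_key(setting, namespace='CELERY'):
    """Roughly how a namespaced Django setting maps to a lowercase
    Celery config key: strip the namespace prefix, then lowercase.
    (A simplified sketch, not Celery's real code.)
    """
    prefix = namespace + '_'
    if not setting.startswith(prefix):
        return None  # not picked up from the namespaced settings at all
    return setting[len(prefix):].lower()


# The correct setting maps onto the option Celery actually reads:
print(to_celery_key('CELERY_TASK_DEFAULT_QUEUE'))  # task_default_queue

# The documented (wrong) name maps to 'default_queue', which is not
# a valid lowercase option, so it silently does nothing:
print(to_celery_key('CELERY_DEFAULT_QUEUE'))       # default_queue
```

So CELERY_DEFAULT_QUEUE becomes default_queue, an option Celery never looks at, and the broker falls back to the default celery queue.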
Solution summary
Using CELERY_DEFAULT_QUEUE
in a configuration module (loaded via config_from_object
) has no effect on the queue.
Use CELERY_TASK_DEFAULT_QUEUE
instead.
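Applied to the settings.py above, the fix is a one-line rename of the setting (keeping the same environment-variable fallback; whether you also rename the CELERY_DEFAULT_QUEUE environment variable itself is up to you):

```python
...
CELERY_TASK_DEFAULT_QUEUE = os.environ.get('CELERY_DEFAULT_QUEUE', 'proj:dev')
```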