I'm using autodiscover in my celery.py file to gather tasks. Until recently it picked up every app's tasks.py, but for some reason the functions in my config.tasks.py are no longer being picked up, while all the other apps still are. If I run from "config.tasks import *" there are no errors, and I can run the tasks manually through the shell.
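For reference, a minimal sketch of the autodiscover setup in question, assuming a project package named config that holds the Django settings (the names mirror the question, not a confirmed layout). By default autodiscover_tasks() only scans apps listed in INSTALLED_APPS, so an importable config/tasks.py whose app is not installed is skipped silently:

```python
# celery.py -- a minimal sketch of the standard Django/Celery wiring,
# assuming the settings module is "config.settings" (adjust to taste).
import os

from celery import Celery

os.environ.setdefault("DJANGO_SETTINGS_MODULE", "config.settings")

app = Celery("config")
app.config_from_object("django.conf:settings", namespace="CELERY")

# autodiscover_tasks() looks for a tasks.py inside each app listed in
# INSTALLED_APPS; a tasks module whose app was removed from (or never
# added to) INSTALLED_APPS stops being discovered without any error.
app.autodiscover_tasks()

# Naming the package explicitly side-steps the INSTALLED_APPS dependency:
# app.autodiscover_tasks(["config"])
```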
Run task from another periodic task with celery
I have a periodic task that should trigger another task. The expected behavior: the first task collects some data from an external service, then loops over this data (a list) and calls another task, passing the current iteration of the loop as an argument. I want the tasks in the loop to run asynchronously. I wrote code that runs a task periodically, but
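A minimal sketch of the fan-out pattern described above, with hypothetical task names (fetch_and_dispatch, process_item) and a stubbed external-service call; invoking .delay() inside the loop queues each iteration as its own asynchronous task instead of running it inline:

```python
from celery import shared_task

def fetch_from_external_service():
    # placeholder standing in for the real external call
    return ["a", "b", "c"]

@shared_task
def process_item(item):
    # handles a single element; many of these run concurrently on workers
    ...

@shared_task
def fetch_and_dispatch():
    # this is the task to register on the beat schedule
    for item in fetch_from_external_service():
        process_item.delay(item)  # queue the call, do not execute inline
```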
When is on_failure called by celery when retrying
I'm using task inheritance in Celery to retry (max_retries: 3) on certain exceptions and to log failures. Is on_failure called on each failed attempt, or only after the last attempt (the 3rd in my case)? Answer: Tested this, and on_failure is only run after the retries have all failed. So, using the example given above, on_failure is called after the 3rd
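A sketch illustrating the tested behavior, with hypothetical names (LoggingTask, flaky_task, TransientError): on_retry fires for each re-queued attempt, while on_failure fires exactly once, after the final attempt fails:

```python
from celery import Task, shared_task

class TransientError(Exception):
    """Placeholder for the 'certain exceptions' being retried on."""

class LoggingTask(Task):
    def on_failure(self, exc, task_id, args, kwargs, einfo):
        # per the answer above: invoked once, after the final retry fails
        print(f"task {task_id} failed for good: {exc!r}")

@shared_task(bind=True, base=LoggingTask, max_retries=3)
def flaky_task(self):
    try:
        raise TransientError("upstream hiccup")  # stand-in for real work
    except TransientError as exc:
        # re-queues the task; on_retry (not on_failure) runs per attempt
        raise self.retry(exc=exc, countdown=5)
```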
How to hide Celery task id from structlog?
I use Structlog and Celery in my Django application and both work very well, but I would like to prevent the task_id from appearing in the log when the worker executes tasks. How can I do that? The reason is that task_id is a 36-character key:value pair, which makes the log output hard to read. This is how my
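One approach, sketched under the assumption that task_id arrives as a key in the structlog event dict (as it does with integrations such as django-structlog): a small processor that pops the key before the renderer runs:

```python
import structlog

def drop_task_id(logger, method_name, event_dict):
    # remove the 36-character task_id before the renderer sees it
    event_dict.pop("task_id", None)
    return event_dict

structlog.configure(
    processors=[
        drop_task_id,  # must come before the final renderer
        structlog.processors.KeyValueRenderer(),
    ],
)
```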
Django+Celery Module not found error with app installed from git
I have a project split into 2 separate apps, and both depend on another shared app as a reusable package (I store my models there and install it from a git link). When I run the celery worker locally everything works perfectly, but if I use Docker (with celery in a separate image) I get this error. For some reason celery gets an improper configuration from
Celery with RabbitMQ Virtual Host not accepting tasks from app.tasks.py in Django
Help needed! PS: I have already created the virtual hosts using this link: Celery and Vhosts. Below are my settings.py, my celery.py (which starts with import os, from celery import Celery, from django.conf import settings), and app1.tasks.py. Attaching images from my terminal: one is of the worker instance, the other is of the shell from which I am firing the tasks. Note: I started both after the settings and all, but
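For reference, a sketch of how the virtual host fits into the broker URL (the credentials and vhost name below are placeholders): the vhost is the path segment at the end of the URL, and both the worker and the shell firing the tasks must point at the same vhost, or the worker never sees the messages:

```python
from celery import Celery

# "user", "password" and "myvhost" are placeholders; a producer publishing
# to the default "/" vhost while the worker consumes from "myvhost" looks
# exactly like "tasks not being accepted".
app = Celery("app1", broker="amqp://user:password@localhost:5672/myvhost")
```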
In Celery/Django: cannot find reference 'control' in celery.task.control
I'm trying to use celery in my project. When I use from celery.task.control import revoke, PyCharm highlights control and warns me: cannot find reference 'control' in __init__.py. PyCharm also puts a broken line under revoke and warns: Unresolved reference 'revoke'. But when I run the project, celery works great without any problem calling tasks or revoking
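The warning appears to be cosmetic: celery.task.control is put together dynamically at runtime, so PyCharm's static analysis cannot resolve it even though it imports fine when the code actually runs. A sketch of a statically resolvable alternative via the app's control interface (broker URL is a placeholder):

```python
from celery import Celery

app = Celery("proj", broker="amqp://localhost")  # placeholder broker URL

# statically resolvable equivalent of:
#   from celery.task.control import revoke; revoke(task_id)
app.control.revoke("the-task-id", terminate=False)
```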
Celery --pool=threads -- what does this do and how do I use it properly?
I'm hitting a segfault error while running a task using Celery. After looking up the issue, it seems others have solved similar issues by starting celery with --pool=threads. When I try passing --pool=threads I get ModuleNotFoundError: No module named 'threads'. I don't believe this is the same as the thread module, which would throw the error No module named
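A sketch of what may be going on, with proj as a placeholder app name: Celery resolves unknown pool names by importing them as modules, so on versions that predate the built-in threads pool (added around Celery 4.4, if I recall correctly) --pool=threads produces exactly this ModuleNotFoundError, and upgrading Celery fixes it:

```python
from celery import Celery

app = Celery("proj", broker="amqp://localhost")  # placeholder broker URL

if __name__ == "__main__":
    # equivalent to: celery -A proj worker --pool=threads --loglevel=info
    # An unknown pool name is treated as an importable module path, which
    # is where "No module named 'threads'" comes from on older releases.
    app.worker_main(["worker", "--pool=threads", "--loglevel=info"])
```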
How to start a celery worker in a Django project
I have a Django project with the directory structure mentioned below. I am trying to use Celery to run tasks in the background, but I am facing some trouble while running the worker: whenever I issue the following command, I get an error. Command: from the project directory where manage.py resides, ModuleNotFoundError: No module named 'tasks'; from the project directory where celery.py resides
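A sketch of the conventional layout and invocation, with projectname standing in for the package that contains settings.py; the -A argument names that package (not the tasks module), and the command is run from the directory holding manage.py:

```python
# projectname/celery.py -- lives next to settings.py inside the package.
import os

from celery import Celery

os.environ.setdefault("DJANGO_SETTINGS_MODULE", "projectname.settings")

app = Celery("projectname")
app.config_from_object("django.conf:settings", namespace="CELERY")
app.autodiscover_tasks()

# Start the worker from the directory containing manage.py:
#   celery -A projectname worker --loglevel=info
```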
Django Celery delay() always pushing to default ‘celery’ queue
I'm ripping my hair out with this one. The crux of my issue is that setting CELERY_DEFAULT_QUEUE in my Django settings.py is not forcing my tasks to go to the queue I've set up; they always go to the default celery queue in my broker. However, if I specify queue='proj:dev' in the shared_task decorator, it goes
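One likely culprit, sketched below: with app.config_from_object("django.conf:settings", namespace="CELERY"), Celery 4+ reads the new-style setting names, so the pre-4.0 name CELERY_DEFAULT_QUEUE is silently ignored and tasks fall back to the built-in celery queue:

```python
# settings.py -- a sketch; under namespace="CELERY" this maps to Celery's
# task_default_queue setting, whereas the old CELERY_DEFAULT_QUEUE name
# is ignored on Celery 4+.
CELERY_TASK_DEFAULT_QUEUE = "proj:dev"

# The worker must also consume that queue:
#   celery -A proj worker -Q proj:dev
```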