
Tag: celery

Django celery error while adding tasks to RabbitMQ message queue: AttributeError: ‘ChannelPromise’ object has no attribute ‘__value__’

I have set up Celery, RabbitMQ, and a Django web server on DigitalOcean. RabbitMQ runs on a different server from the one my Django app runs on. When I try to add tasks to the queue using delay I get the error AttributeError: ‘ChannelPromise’ object has no attribute ‘__value__’. I am adding the task to my message queue from the Django shell.
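The question doesn’t show its configuration, but a minimal sketch of the usual Django/Celery wiring for a remote broker looks like this; my_app, the broker host, and the credentials are placeholders, not values from the question. In practice this error tends to surface when kombu cannot establish the channel to the broker, so the broker URL and the reachability of port 5672 from the Django host are the first things to check.

```python
# my_app/celery.py -- hypothetical module name
import os

from celery import Celery

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'my_app.settings')

app = Celery('my_app')
# Reads CELERY_* keys from Django settings, e.g. a remote RabbitMQ broker:
#   CELERY_BROKER_URL = 'amqp://user:password@203.0.113.10:5672/myvhost'  # placeholder host/vhost
app.config_from_object('django.conf:settings', namespace='CELERY')
app.autodiscover_tasks()
```

With that in place, a task queued from the Django shell with my_task.delay(...) only succeeds if the broker is reachable with those credentials.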

Celery jobs not running on heroku (python/django app)

I have a Django app set up with some scheduled tasks. The app is deployed on Heroku with Redis. The task runs if invoked synchronously in the console, or locally when I also have Redis and Celery running. However, the scheduled jobs are not running on Heroku. My task: celery.py: In the Procfile: worker: celery -A my_app worker --beat -S django -l
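For reference, a hedged sketch of what the two pieces mentioned in the question typically contain; the module names are assumed, and the log level that is cut off above is filled in here purely as an example.

```python
# my_app/celery.py -- assumed module name
import os

from celery import Celery

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'my_app.settings')

app = Celery('my_app')
app.config_from_object('django.conf:settings', namespace='CELERY')
app.autodiscover_tasks()

# Procfile -- one dyno running both the worker and the beat scheduler,
# mirroring the line in the question (log level is an example value):
#   worker: celery -A my_app worker --beat -S django -l info
```

On Heroku the worker process type also has to be scaled up (heroku ps:scale worker=1); with zero worker dynos neither the worker nor the embedded beat scheduler ever starts, which is one common cause of the “works locally, not on Heroku” symptom.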

Module ‘project_name’ has no attribute ‘celery’

I’m trying to set up a background task using Celery and RabbitMQ on Django, but I’m getting an error saying that my project has no attribute celery. I’m using PyCharm and installed Celery through it. I’m new to Celery, but I’ve read a lot of articles about this issue (AttributeError: module ‘module’ has no attribute ‘celery’; this one seems
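A sketch of the conventional layout from the Celery/Django documentation that this error usually points at, assuming the project package is called project_name as in the title:

```python
# project_name/celery.py
import os

from celery import Celery

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'project_name.settings')

app = Celery('project_name')
app.config_from_object('django.conf:settings', namespace='CELERY')
app.autodiscover_tasks()

# project_name/__init__.py -- re-export the app so the package exposes it:
#   from .celery import app as celery_app
#   __all__ = ('celery_app',)
```

With both files in place, `celery -A project_name worker` can locate the app; the AttributeError typically means the celery.py module or the __init__.py import is missing or misnamed.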

python celery monitoring events not being triggered

I have the following project directory: tasks.py main.py monitor.py I started the celery worker and monitor.py, but only the worker-online event is triggered, while other events like task-succeeded are never triggered or handled. What am I missing here? Answer: Enable the worker’s task events with the CLI option -E or --task-events and try to capture all events:
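A sketch of a monitor that captures task events, assuming the Celery app in tasks.py is named app and the worker was started with -E; the names are placeholders rather than the question’s actual code.

```python
# monitor.py -- sketch; expects the worker to run with task events on,
# e.g. `celery -A tasks worker -E`
from tasks import app  # the Celery() instance defined in tasks.py (assumed name)


def on_task_succeeded(event):
    print(f"task-succeeded: {event['uuid']}")


def on_any_event(event):
    print(f"event: {event['type']}")


with app.connection() as connection:
    receiver = app.events.Receiver(connection, handlers={
        'task-succeeded': on_task_succeeded,
        '*': on_any_event,  # everything else, including worker-online/worker-heartbeat
    })
    receiver.capture(limit=None, timeout=None, wakeup=True)
```

Without -E the worker only emits worker-* events (online, heartbeat, offline), which matches the behaviour described above.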

python celery invalid value for -A unable to load application

I have the following project directory: task.py main.py When I run the command to initialize the celery workers, I get this error: But when I rename main.py to celery.py as it was earlier, there is no issue. What am I missing here? Answer: There are two approaches. Import your app in azima/__init__.py. You can omit the last line; celery will recognize the
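A sketch of the two approaches the answer refers to, using the package and module names from the question (azima, main.py), which will differ in other projects:

```python
# Approach 1: re-export the app so `celery -A azima worker` can find it.
# azima/__init__.py
from .main import app as celery_app

__all__ = ('celery_app',)  # optional -- celery also picks up any Celery instance it finds here

# Approach 2: rename azima/main.py to azima/celery.py;
# with no app attribute on the package, `-A azima` falls back to importing azima.celery.
```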

Run the same celery task in a loop

How do I run this kind of celery task properly? I need all tasks to be performed sequentially, each one after the previous has completed. I tried using time.sleep(), but in that case returning the result waits until all tasks are completed. I need the result returned while all 10 tasks run sequentially in the background. There is a group() in
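One way to get “return immediately, but run the 10 copies strictly one after another in the background” is a chain of immutable signatures instead of group() or time.sleep(); my_task and its argument below are placeholders, not code from the question.

```python
from celery import chain

from myapp.tasks import my_task  # hypothetical task


# .si() builds an immutable signature, so one task's return value is not
# fed into the next; chain() runs the signatures strictly in order.
workflow = chain(*[my_task.si(i) for i in range(10)])

result = workflow.apply_async()  # returns immediately; the worker executes the 10 tasks sequentially
```

group() runs its members in parallel, which is why it doesn’t give the sequential behaviour asked for here.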
