I have set up Celery, RabbitMQ and a Django web server on DigitalOcean. RabbitMQ runs on a different server from my Django app. When I try to add tasks to the queue using delay, I get the error AttributeError: 'ChannelPromise' object has no attribute 'value'. I am adding the task to my message queue from the Django shell.
Tag: rabbitmq
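This error typically surfaces when the broker connection cannot be established, so the first thing to check is the broker URL in the Django settings. A minimal sketch, assuming illustrative credentials, IP and vhost (none of these values come from the question):

```python
# settings.py (Django) - every value here is a placeholder
CELERY_BROKER_URL = "amqp://myuser:mypassword@203.0.113.10:5672/myvhost"
# scheme://user:password@remote-broker-ip:port/vhost
```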
Systemd consumer service starts before broker (RabbitMQ)
I have a RabbitMQ server set up on my Raspberry Pi, and I want the same device to run a consumer that handles messages for one of my queues. I first tried executing it from the crontab, but later realized that running it as a systemd service may be a better idea. This is my first time setting up a
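Systemd expresses this ordering directly: the consumer unit can declare that it starts after the local broker unit. A sketch of such a unit file, where the service path and description are assumptions for illustration:

```ini
# /etc/systemd/system/queue-consumer.service - illustrative paths/names
[Unit]
Description=RabbitMQ queue consumer
After=network-online.target rabbitmq-server.service
Wants=rabbitmq-server.service

[Service]
ExecStart=/usr/bin/python3 /home/pi/consumer.py
Restart=on-failure
RestartSec=5

[Install]
WantedBy=multi-user.target
```

`After=` only orders startup; `Restart=on-failure` additionally recovers the consumer if the broker is briefly unavailable and the process exits.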
Why does the publisher declare a queue in Pika RabbitMQ?
I have gone through the fundamentals of RabbitMQ. One thing I figured out is that a publisher does not publish directly to a queue; the exchange decides which queue the message should be delivered to, based on the routing key and the type of exchange (the code below uses the default exchange). I have also found an example publisher. In line #9 the
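The short answer is that `queue_declare` is idempotent: declaring the queue in the publisher merely guarantees it exists before the first publish, and with the default ("") exchange the routing key is simply the queue name. A sketch of the tutorial-style publisher, with an illustrative queue name:

```python
# queue_declare is safe to repeat; it creates the queue only if missing.
# With the default (nameless) exchange, routing_key == queue name.

QUEUE = "hello"  # illustrative queue name

def publish(channel, body: bytes) -> None:
    channel.queue_declare(queue=QUEUE)        # no-op if queue already exists
    channel.basic_publish(exchange="",        # default exchange
                          routing_key=QUEUE,  # routes straight to QUEUE
                          body=body)

if __name__ == "__main__":
    import pika  # requires a reachable broker
    conn = pika.BlockingConnection(pika.ConnectionParameters("localhost"))
    publish(conn.channel(), b"Hello World!")
    conn.close()
```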
Running RabbitMQ Pika with Quart
I am using the Quart framework, but I also need to use the RabbitMQ Pika connector, and I can't get them to play nicely because they both run infinite loops. Entrypoint: Service Class: The code isn't even reaching the print('Thread created…') and I don't understand why. From this question I do understand that RabbitMQ isn't thread-safe, but I don't understand
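One common workaround, sketched here with assumed app and queue names rather than the asker's code, is to start Pika's blocking consume loop on a daemon thread before handing the main thread to Quart's event loop:

```python
# Sketch: keep the blocking consume loop off the main thread so Quart's
# asyncio loop can run. `consumer` is any callable that blocks.
import threading

def start_background_consumer(consumer, *args) -> threading.Thread:
    thread = threading.Thread(target=consumer, args=args, daemon=True)
    thread.start()          # returns immediately; the loop runs elsewhere
    return thread

if __name__ == "__main__":
    import pika
    from quart import Quart  # hypothetical wiring, names are illustrative

    app = Quart(__name__)

    def consume() -> None:
        conn = pika.BlockingConnection(pika.ConnectionParameters("localhost"))
        ch = conn.channel()
        ch.queue_declare(queue="events")
        ch.basic_consume(queue="events",
                         on_message_callback=lambda c, m, p, b: print(b),
                         auto_ack=True)
        ch.start_consuming()  # blocks, but only on this thread

    start_background_consumer(consume)
    app.run()  # Quart's own loop now owns the main thread
```

The key constraint is that the Pika connection is created and used entirely inside the background thread; nothing Pika-related is shared with the Quart side.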
Celery with RabbitMQ virtual host not accepting tasks from app.tasks.py in Django
Help needed! PS: I have already created virtual hosts using this link. Celery and vhosts: settings.py, celery.py (import os, from celery import Celery, from django.conf import settings), app1.tasks.py. Attaching images from my terminal: one of the worker instance, the other of the shell from which I am firing the tasks. Note: I started both after settings and all but
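A frequent culprit with vhost setups is a broker URL that omits the vhost segment, so tasks land on the default "/" vhost while the worker watches another. A sketch, where the project name, credentials and vhost are all placeholders:

```python
# Hypothetical broker URL: the path after the final "/" is the vhost.
# Omit it and Celery falls back to the default "/" vhost, so the worker
# and the shell firing tasks can silently end up on different vhosts.
BROKER_URL = "amqp://myuser:mypassword@localhost:5672/myvhost"

def vhost_of(url: str) -> str:
    """Extract the vhost segment from an amqp URL (naive split)."""
    return url.rsplit("/", 1)[-1]

if __name__ == "__main__":
    from celery import Celery  # requires celery installed
    app = Celery("myproject", broker=BROKER_URL)
    app.autodiscover_tasks()   # finds app1/tasks.py and friends
```

Both the worker process and the process calling `.delay()` must load a configuration with this same URL.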
Unable to start Airflow worker/flower and need clarification on Airflow architecture to confirm that the installation is correct
Running a worker on a different machine results in the errors specified below. I have followed the configuration instructions and have synced the dags folder. I would also like to confirm that RabbitMQ and PostgreSQL only need to be installed on the Airflow core machine and do not need to be installed on the workers (the workers only connect to the
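On the architecture point: with the CeleryExecutor, the broker (RabbitMQ) and the metadata database (PostgreSQL) run on the core machine only, and workers reach both over the network using the same connection strings. A sketch of the relevant airflow.cfg entries, with illustrative host names and credentials:

```ini
# airflow.cfg - host names and credentials are placeholders;
# workers use identical values, pointing at the core machine.
[core]
executor = CeleryExecutor
sql_alchemy_conn = postgresql+psycopg2://airflow:airflow@core-host/airflow

[celery]
broker_url = amqp://airflow:airflow@core-host:5672/
result_backend = db+postgresql://airflow:airflow@core-host/airflow
```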
RabbitMQ hello world connection only works on localhost (Python)
I have this simple code taken from the RabbitMQ tutorial (http://www.rabbitmq.com/tutorials/tutorial-one-python.html). It works, but if I replace localhost with the IP of my computer, from my own computer or another computer on the same network, I get this error: I have no idea why; should I change something in the connection? Answer It's a user grant problem. You are
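The excerpted answer points at user grants: by default the built-in guest user may only connect over localhost. A sketch of the server-side fix, with placeholder credentials:

```shell
# Run on the broker host; user name and password are placeholders.
rabbitmqctl add_user myuser mypassword
rabbitmqctl set_permissions -p / myuser ".*" ".*" ".*"   # configure/write/read
```

The client then connects with `pika.PlainCredentials("myuser", "mypassword")` passed into `pika.ConnectionParameters` instead of relying on the guest default.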
Python and RabbitMQ – Best way to consume events from multiple channels?
I have two separate RabbitMQ instances and I'm trying to find the best way to listen for events from both. For example, I can consume events on one with the following: I have a second host, "host2", that I'd like to listen to as well. I thought about creating two separate threads to do this, but from what I've read, pika
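One workable pattern is one BlockingConnection per thread: pika objects must never be shared across threads, but a connection created and used entirely inside its own thread is fine. A sketch with illustrative host and queue names:

```python
# Sketch: a thread per broker, each owning its own private pika connection.
import threading

def consume(host: str) -> None:
    import pika  # deferred import; needs a reachable broker at `host`
    conn = pika.BlockingConnection(pika.ConnectionParameters(host))
    ch = conn.channel()
    ch.queue_declare(queue="events")
    ch.basic_consume(queue="events",
                     on_message_callback=lambda c, m, p, b: print(host, b),
                     auto_ack=True)
    ch.start_consuming()  # blocks inside this thread only

def start_consumers(hosts, worker=consume):
    """Spawn one daemon thread per host running worker(host)."""
    threads = [threading.Thread(target=worker, args=(h,), daemon=True)
               for h in hosts]
    for t in threads:
        t.start()
    return threads

if __name__ == "__main__":
    for t in start_consumers(["host1", "host2"]):
        t.join()
```

The alternative for a single thread is pika's asynchronous `SelectConnection`, but the connection-per-thread version is usually the simpler one to reason about.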
How to create a delayed queue in RabbitMQ?
What is the easiest way to create a delay (or parking) queue with Python, Pika and RabbitMQ? I have seen similar questions, but none for Python. I find this a useful idea when designing applications, as it allows us to throttle messages that need to be re-queued. There is always the possibility that you will receive more messages