Python
Leo, 2014-10-24 11:10:48

Why isn't Celery returning a RabbitMQ response?

I am using Celery and RabbitMQ in my application.
Everything works great on localhost:
In call.py I use groups: a request is sent to several workers, they compute the result and return it, and I receive it in the same call.py.
The problems start when I run the workers on a different host.
I run RabbitMQ and call.py on macbook, Celery-workers on PC (windows).
The workers receive the messages and compute the result (every task succeeds), but for some reason they do not send these results to the backend (the backend is specified in the settings; otherwise results would not have come back on localhost either).
No errors in Celery logs, nothing. RabbitMQ logs - nothing interesting (accepting/closing connections).
The Mac has the IP 192.168.1.14, which I also use in the broker and backend parameters when creating the Celery instance (the firewall on the Mac is turned off).
In rabbitmq-env.conf :
NODE_IP_ADRESS=192.168.1.14
In the router settings, I forwarded port 5672 to the macbook.
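Before digging into Celery itself, it is worth confirming that the worker machine can reach the broker port at all. Below is a small sketch (my own helper, not from the question) that checks TCP reachability of the broker; the IP and port are taken from the question.

```python
# reachability_check.py -- quick sanity check that a worker host can open a
# TCP connection to the RabbitMQ broker port. A sketch; host/port values are
# the ones mentioned in the question.
import socket


def port_open(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False


# Usage (run on the Windows PC where the workers live):
#   print(port_open("192.168.1.14", 5672))  # True means the broker is reachable
```

If this returns False from the Windows machine, the problem is networking (firewall, NAT, RabbitMQ bind address), not Celery.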
Even when I point the broker and backend at my public IP (5.57.NN), everything still works as long as both RabbitMQ and Celery run on localhost: I get the answers computed by the workers. But as soon as I move the workers to the other machine, they all compute the results, yet for some reason they don't (or can't) send them back.
What is the problem, and how do I fix it? I've been banging my head on this for three days and can't figure it out.
Thank you!


1 answer(s)
Leo, 2014-10-25
@STLEON

Strangely enough, I launched the workers on a Linux distribution instead, and everything immediately worked: the answers come back.
