Send Celery Tasks to another machine
Note: Make sure you are using Python 3.6; you can follow this post to install conda and manage your environment properly.
Using Celery & RabbitMQ
On Machine A:
- Install Celery & RabbitMQ.
- Clone this repository and update the `username`, `password`, `ip` and `vhost` as needed (see the sketch of `remote.py` below).
- Configure RabbitMQ so that Machine B can connect to it:
# add new user
sudo rabbitmqctl add_user <user> <password>
# add new virtual host
sudo rabbitmqctl add_vhost <vhost_name>
# set permissions for user on vhost
sudo rabbitmqctl set_permissions -p <vhost_name> <user> ".*" ".*" ".*"
# restart RabbitMQ so the changes take effect
sudo systemctl restart rabbitmq-server
- In your Python 3.6 environment, launch the Python console and run:
from remote import add
a = add.delay(3, 3)
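For reference, here is a minimal sketch of what `remote.py` might look like for the RabbitMQ setup. The broker URL pieces (`user`, `password`, `ip`, `vhost_name`) stand in for the values you configured above, and the task definition is an assumption, so check the actual file in the repository:
from celery import Celery

# Broker URL built from the RabbitMQ user, password, ip and vhost configured above.
# The rpc:// result backend lets Machine A read a.status and a.result back over RabbitMQ.
app = Celery('remote',
             broker='amqp://<user>:<password>@<ip>:5672/<vhost_name>',
             backend='rpc://')

@app.task
def add(x, y):
    return x + y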
On Machine B:
- Install Celery.
- Clone this repository and update the `username`, `password`, `ip` and `vhost` as needed.
- Run the worker:
celery worker -l info -A remote
As soon as you run the above command, you should see the worker receive and execute the task. If you go back to Machine A and check `a.status` and `a.result`, you should see that the task has finished and the result has been returned.
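For example, back in the Python console on Machine A:
a.status    # 'SUCCESS' once the worker on Machine B has run the task
a.result    # 6 for add.delay(3, 3)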
Using Celery & Redis
On Machine A:
- Install Celery.
- Clone this repository and update the `ip`, `port` and `redisdb` as needed (see the sketch of `remote.py` below).
- Install Redis and configure it properly as covered here.
- In your Python 3.6 environment, install the Redis client by running:
pip install redis
- In your Python 3.6 environment, launch the Python console and run:
from remote import add
a = add.delay(3, 3)
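As with the RabbitMQ variant, here is a minimal sketch of what `remote.py` might look like when using Redis; the `ip`, `port` and `redisdb` placeholders are the values you configured, and the task definition is assumed to match the repository:
from celery import Celery

# Broker and result backend both point at the Redis instance configured above.
app = Celery('remote',
             broker='redis://<ip>:<port>/<redisdb>',
             backend='redis://<ip>:<port>/<redisdb>')

@app.task
def add(x, y):
    return x + y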
On Machine B:
- Install Celery.
- Clone this repository and update the `ip`, `port` and `redisdb` as needed.
- In your Python 3.6 environment, install the Redis client by running:
pip install redis
- Run the worker:
celery worker -l info -A remote
As soon as you run the above command, you should see the worker receive and execute the task. If you go back to Machine A and check `a.status` and `a.result`, you should see that the task has finished and the result has been returned.