This is a simple demo of Celery task serving. It includes:
- Celery worker
- Flower for monitoring
- A simple Flask server to submit and monitor tasks
- Dockerfile and docker-compose.yml
- Redis as the broker and PostgreSQL as the result backend
If you are using docker-compose, set your own ports in docker-compose.yml.
Install Redis on your local machine; the default configuration is redis://localhost:6379/0.
Example environment variables:
CELERY_BROKER_URL=redis://localhost:6379/0
CELERY_RESULT_BACKEND=redis://localhost:6379/0
CELERY_FLOWER_METRICS_URL=http://localhost:5555/metrics
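For reference, a minimal sketch of what the worker module could look like, assuming the Celery app lives in worker.py as worker.app and that the demo task is a simple add (the names here are assumptions, not the demo's exact code):

# worker.py -- minimal sketch; the add task and defaults are assumptions
import os

from celery import Celery

app = Celery(
    "worker",
    broker=os.environ.get("CELERY_BROKER_URL", "redis://localhost:6379/0"),
    backend=os.environ.get("CELERY_RESULT_BACKEND", "redis://localhost:6379/0"),
)

@app.task
def add(x, y):
    # Trivial task matching the "args": [4, 4] payloads in the curl examples below.
    return x + y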
pip install -r requirements.txt
celery -A worker.app worker --loglevel=INFO
celery -A worker.app --broker=redis://redis:6379/0 --result-backend=redis://redis:6379/0 flower
You can visit http://localhost:5555 to monitor the Celery worker.
FLASK_APP=server flask run
Your Flask server will run on http://localhost:5000.
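The Flask endpoints exercised below (/submit, /asubmit, /check, /metrics) roughly correspond to the following sketch; the function bodies and response field names are assumptions inferred from the curl examples, not the demo's exact code:

# server.py -- rough sketch of the demo endpoints (assumed, not the exact implementation)
import os
import urllib.request

from flask import Flask, jsonify, request

from worker import add, app as celery_app

app = Flask(__name__)  # picked up by FLASK_APP=server flask run

@app.route("/submit", methods=["POST"])
def submit():
    data = request.get_json()
    result = add.apply_async(args=data["args"])
    # Block until the task finishes or the given timeout expires.
    value = result.get(timeout=data.get("timeout", 10))
    return jsonify({"task_id": result.id, "result": value})

@app.route("/asubmit", methods=["POST"])
def asubmit():
    data = request.get_json()
    result = add.apply_async(args=data["args"])
    # Return immediately; poll /check/<task_id> for the result.
    return jsonify({"task_id": result.id})

@app.route("/check/<task_id>")
def check(task_id):
    result = celery_app.AsyncResult(task_id)
    return jsonify({"state": result.state,
                    "result": result.result if result.successful() else None})

@app.route("/metrics")
def metrics():
    # Presumably proxies Flower's Prometheus metrics (see CELERY_FLOWER_METRICS_URL above).
    url = os.environ.get("CELERY_FLOWER_METRICS_URL", "http://localhost:5555/metrics")
    with urllib.request.urlopen(url) as resp:
        return resp.read(), 200, {"Content-Type": "text/plain"}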
curl --location 'http://127.0.0.1:5000/submit' \
--header 'Content-Type: application/json' \
--data '{
"timeout": 10,
"args": [
4,
4
]
}'
curl --location 'http://127.0.0.1:5000/asubmit' \
--header 'Content-Type: application/json' \
--data '{
"args": [
4,
4
]
}'
curl --location 'http://127.0.0.1:5000/check/{your-id-from-previous-submit}'
curl --location 'http://127.0.0.1:5000/metrics'
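The same submit-and-poll flow from Python, using the requests library (the response field names, e.g. task_id and state, are assumptions; adjust to the actual server responses):

# client.py -- example client for the async flow above (field names are assumptions)
import time

import requests

BASE = "http://127.0.0.1:5000"

# Submit asynchronously, then poll /check until the task finishes.
task = requests.post(f"{BASE}/asubmit", json={"args": [4, 4]}).json()
task_id = task["task_id"]  # assumed field name

while True:
    status = requests.get(f"{BASE}/check/{task_id}").json()
    if status.get("state") in ("SUCCESS", "FAILURE"):
        print("state:", status["state"], "result:", status.get("result"))
        break
    time.sleep(0.5)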
docker build -t celery_task_serving_demo:latest .
docker-compose up
docker-compose up -d