I'm trying to use Celery in my Django app with Redis as the broker.
In my settings file I set CELERY_BROKER_URL = 'redis://redis:6379' and CELERY_RESULT_BACKEND = 'redis://redis:6379'.
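For reference, these CELERY_* settings only take effect if the Celery app is configured to read them from Django with the CELERY namespace; a typical celery.py (module and project names here are assumptions based on the project apparently being called web) would look like:

```python
# web/celery.py -- names assumed from the project layout in the question
import os

from celery import Celery

# Point Celery at the Django settings module before creating the app.
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "web.settings")

app = Celery("web")

# Read CELERY_* keys (CELERY_BROKER_URL, CELERY_RESULT_BACKEND, ...)
# from Django settings. Without namespace="CELERY" those keys are
# ignored and Celery falls back to its default amqp:// broker.
app.config_from_object("django.conf:settings", namespace="CELERY")

# Discover tasks.py modules in all installed Django apps.
app.autodiscover_tasks()
```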
My docker-compose file looks like this:
web:
  build:
    context: ./web/
    dockerfile: Dockerfile
  image: &web web
  env_file:
    - .env
  command: "gunicorn web.wsgi:application -w 2 -b :4000"
  volumes:
    - ./web/:/web
  expose:
    - "4000"
  depends_on:
    - db
    - redis
    - worker
    - beat
db:
  build:
    context: ./database/
    dockerfile: Dockerfile
  volumes:
    - data:/var/lib/postgresql/data
  env_file:
    - .env
  expose:
    - "5432"
redis:
  build:
    context: ./cache/
    dockerfile: Dockerfile
  expose:
    - "6379"
worker:
  build:
    context: ./web/
    dockerfile: Dockerfile
  image: *web
  command: "celery -A web worker -l debug"
  ports: []
  depends_on:
    - redis
    - db
beat:
  build:
    context: ./web/
    dockerfile: Dockerfile
  image: *web
  command: "celery -A web beat -l info"
  ports: []
  depends_on:
    - redis
    - db
When I run docker-compose up, the beat service starts fine but the worker fails with the error:
consumer: Cannot connect to amqp://guest:**@127.0.0.1:5672//: [Errno 111] Connection refused
Somehow the worker service is trying to use RabbitMQ as the broker.
Could someone please shed some light on what I'm doing wrong here?
rabbitmq as broker and redis as a caching service - Hippolyte Fayol