0
votes

I have a Django application that uses Celery with a Redis broker for asynchronous task execution. Currently, the app has 3 queues (and 3 workers) that connect to a single Redis instance for communication. The first two are prefork-based workers and the third is a gevent-based worker.

The Celery setting variables regarding the broker and backend look like this:

CELERY_BROKER_URL="redis://localhost:6379/0"
CELERY_RESULT_BACKEND="redis://localhost:6379/1"
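For context, the three-queue setup described above might look roughly like this in the Django settings; the queue names (`q1`/`q2`/`q3`) and task paths here are illustrative assumptions, not taken from the actual project:

```python
# settings.py -- sketch of the current setup (queue and task names are made up)
CELERY_BROKER_URL = "redis://localhost:6379/0"
CELERY_RESULT_BACKEND = "redis://localhost:6379/1"

# Route each task to a dedicated queue; each worker subscribes to one queue.
CELERY_TASK_ROUTES = {
    "myapp.tasks.cpu_task_a": {"queue": "q1"},  # handled by a prefork worker
    "myapp.tasks.cpu_task_b": {"queue": "q2"},  # handled by a prefork worker
    "myapp.tasks.io_task": {"queue": "q3"},     # handled by the gevent worker
}
```

The workers would then be started with something like `celery -A myapp worker -Q q1 -P prefork`, `celery -A myapp worker -Q q2 -P prefork`, and `celery -A myapp worker -Q q3 -P gevent`.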

Since Celery's Redis transport implements its FIFO queues with list push and blocking-pop commands (RPUSH/BLPOP-style), I was wondering whether it'd be correct, or even possible, to use different Redis databases for different queues — e.g. q1 uses database .../1 and q2 uses database .../2 for messaging. That way each worker would only listen to its own dedicated database and pick tasks off its queue with less contention.

  • Does this even make any sense?
  • If so, how do you implement something like this in Celery?
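To make the contention concern concrete: each queue is just a Redis list, and a worker's blocking pop only ever competes with other consumers of that same list. Here is a toy pure-Python simulation of the push/blocking-pop FIFO semantics (no real Redis involved; a `deque` stands in for the Redis list):

```python
from collections import deque

# One Redis list per queue; RPUSH appends on the right,
# BLPOP takes from the left head -- together they form a FIFO.
queues = {"q1": deque(), "q2": deque()}

def rpush(queue, msg):
    queues[queue].append(msg)

def blpop(queue):
    # Real BLPOP blocks until an item arrives; here we
    # assume the list is already non-empty.
    return queues[queue].popleft()

rpush("q1", "task-1")
rpush("q1", "task-2")
rpush("q2", "task-3")

# Workers bound to different queues never compete with each other:
assert blpop("q1") == "task-1"  # first-in, first-out within q1
assert blpop("q2") == "task-3"  # q2's worker only ever sees q2's list
```

Note that this separation already holds when the queues live in the same Redis database, since each queue is its own key.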
1
Why did you choose Redis as the broker and not RabbitMQ? What's the load it is supposed to handle? - ItayB
Mostly simplicity. Celery's Redis config seemed dead-simple to me, so I just picked that. - Redowan Delowar

1 Answer

1
votes

First, if you are worried about the load, please specify your expected numbers/rates.

In my opinion, you shouldn't be concerned about the Redis capability to handle your load.

  1. Redis has its own scale-out / scale-in capabilities whenever you'll need them.
  2. You can use RabbitMQ as your broker (running RabbitMQ in Docker is dead-simple as well), which again has its own scale-out capabilities to support high load, so I don't think you should be worried about this point.
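For instance, a throwaway RabbitMQ broker can be started with Docker along these lines (the image tag and port mappings below are the standard ones; adjust as needed):

```shell
# Start RabbitMQ with the management UI (AMQP on 5672, web UI on 15672)
docker run -d --name rabbitmq \
  -p 5672:5672 -p 15672:15672 \
  rabbitmq:3-management

# Then point Celery at it, e.g.:
# CELERY_BROKER_URL = "amqp://guest:guest@localhost:5672//"
```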

As far as I know, there's no way to use different Redis databases for different queues within a single Celery application. You can create separate Celery applications, each with its own broker database, but then you cannot set dependencies between their tasks (canvas primitives: group, chain, etc.). I wouldn't recommend such an option.
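For completeness, the "separate applications" variant mentioned above would look roughly like this (app and task names are illustrative); it also shows the limitation, since each app only publishes to and consumes from its own broker:

```python
from celery import Celery

# Two independent apps, each bound to its own Redis database.
app_q1 = Celery("q1_app", broker="redis://localhost:6379/1")
app_q2 = Celery("q2_app", broker="redis://localhost:6379/2")

@app_q1.task
def extract():
    return "data"

@app_q2.task
def transform(data):
    return data.upper()

# A cross-app chain will NOT behave as expected: extract's result is
# published on app_q1's broker, where app_q2's workers never look,
# so canvas primitives like chain(extract.s(), transform.s())
# only work reliably within a single application.
```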