
With Celery, I have created listeners on Redis to capture all write events. Based on those events, I trigger Celery tasks that migrate the data from Redis to the DB. I'm using the eventlet pool with a concurrency of 1000, and I have 5 Celery queues for processing my data.

celery -A proj worker -l info -P eventlet -c 1000 -Q event_queue,vap_queue,client_queue,group_queue,ap_queue
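
For context, the listener side looks roughly like this. It's only a minimal sketch: the task names (proj.tasks.migrate_event etc.) and the key-prefix routing scheme are placeholders, not my actual code.

import redis
from proj.celery import app  # my Celery app

r = redis.Redis()
pubsub = r.pubsub()
# Requires keyspace notifications enabled on the Redis server:
#   CONFIG SET notify-keyspace-events KEA
pubsub.psubscribe("__keyevent@0__:*")

for message in pubsub.listen():
    if message["type"] != "pmessage":
        continue
    key = message["data"].decode()
    # Dispatch to the queue matching the key prefix (placeholder scheme)
    if key.startswith("event:"):
        app.send_task("proj.tasks.migrate_event", args=[key], queue="event_queue")
    elif key.startswith("client:"):
        app.send_task("proj.tasks.migrate_client", args=[key], queue="client_queue")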

The problem I'm facing: the listener receives all the write events from Redis, and the workers receive the tasks from the listener. But the Celery workers lag when processing a large volume of data (for example, each queue receives roughly 800 tasks per 10 seconds).

I have tried increasing the concurrency to higher values, changing the pool from eventlet to gevent, and setting the prefetch multiplier to 1. Still, my workers are slow to complete tasks.
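
For reference, the prefetch setting can also live in the app config instead of the CLI; a minimal sketch (task_acks_late is an extra setting I'm including as an assumption — it only makes sense if the tasks are idempotent):

app.conf.update(
    worker_prefetch_multiplier=1,  # each worker reserves only one task at a time
    task_acks_late=True,           # assumption: tasks are idempotent, so ack after completion
)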

Can anyone help solve this? I'm new to Celery, actually. :)


1 Answer


Sometimes concurrency is not the main factor in speeding up task consumption. In fact, too much concurrency can lead to many context switches and slow things down. Monitor your server's CPU and memory to check that they are not being overwhelmed by the tasks, and find an optimum number. For CPU-bound tasks I would prefer more worker processes over concurrent threads; for I/O-bound tasks, concurrent threads (eventlet/gevent) are fine.
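
For example (the queue-to-worker split below is illustrative — I don't know which of your queues are CPU-bound vs I/O-bound), you could run separate workers tuned per workload instead of one worker with -c 1000:

# I/O-bound queues (e.g. the DB writes): green-thread pool, moderate concurrency
celery -A proj worker -l info -P eventlet -c 100 -Q event_queue,vap_queue,client_queue

# CPU-bound queues: prefork pool with concurrency near the number of CPU cores
celery -A proj worker -l info -P prefork -c 8 -Q group_queue,ap_queue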