3 votes

I am using celery's apply_async method to queue tasks. I expect about 100,000 such tasks to run every day (and that number will only go up). I am using RabbitMQ as the broker. I ran the code a few days back and RabbitMQ crashed after a few hours. I noticed that apply_async creates a new queue for each task with x-expires set at 1 day. My hypothesis is that RabbitMQ chokes when so many queues are being created. How can I stop celery from creating these extra queues for each task?

I also tried passing the queue parameter to apply_async and assigned an x-message-ttl to that queue. Messages did go to this new queue, but they were consumed immediately and never reached the 30-second TTL I had set. And this did not stop celery from creating those extra queues.

Here's my code:

views.py

    from celery import chain

    chain(task1.s(a), task2.s(b)).apply_async(
        link_error=error_handler.s(a), queue="async_tasks_queue"
    )

tasks.py

    from celery import shared_task

    @shared_task
    def error_handler(uuid, a):
        # Handle error
        pass

    @shared_task
    def task1(a):
        # Do something
        return a

    @shared_task
    def task2(a, b):
        # Do something more
        pass

celery.py

    from celery import Celery
    from django.conf import settings

    app = Celery(
        'app',
        broker=settings.QUEUE_URL,
        backend=settings.QUEUE_URL,
    )

    app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)

    # 'durable' is a queue option, not an AMQP x-argument,
    # so it goes outside queue_arguments
    app.amqp.queues.add(
        "async_tasks_queue",
        durable=True,
        queue_arguments={'x-message-ttl': 30000},
    )

From the celery logs:

    [2016-01-05 01:17:24,398: INFO/MainProcess] Received task: project.tasks.task1[615e094c-2ec9-4568-9fe1-82ead2cd303b]
    [2016-01-05 01:17:24,834: INFO/MainProcess] Received task: project.decorators.wrapper[bf9a0a94-8e71-4ad6-9eaa-359f93446a3f]

RabbitMQ had 2 new queues named "615e094c2ec945689fe182ead2cd303b" and "bf9a0a948e714ad69eaa359f93446a3f" when these tasks were executed. My code is running on Django 1.7.7, celery 3.1.17 and RabbitMQ 3.5.3.

Any other suggestions for executing tasks asynchronously are also welcome.


1 Answer

0 votes

Try using a different result backend; I recommend Redis. When we tried using RabbitMQ as both broker and result backend, we discovered that it was ill suited to the backend role: the amqp result backend declares a separate queue for every task result, which is exactly the queue explosion you are seeing. RabbitMQ is fine as the broker.