
I'm running Celery inside a Flask microservice: tasks.py contains the tasks, and manage.py contains the call to run the Flask server.

This is the relevant part of manage.py:


import subprocess
import sys

from flask_script import Command


class CeleryWorker(Command):
    """Starts the celery worker."""
    name = 'celery'
    capture_all_args = True

    def run(self, argv):
        if "down" in argv:
            # Stop any running worker for this app.
            ret = subprocess.call(
                ['pkill', '-9', '-f', "my_app.celery"])
        else:
            # Start a worker, forwarding any extra arguments.
            ret = subprocess.call(
                ['celery', 'worker', '-A', 'my_app.celery'] + argv)
        sys.exit(ret)


manager.add_command("celery", CeleryWorker())

I can start the service with either `python manage.py runserver` or `celery worker -A my_app.celery`, and it runs perfectly and registers all the tasks in tasks.py.

But in production, I want to handle multiple requests to this microservice, so I want to add gunicorn to serve those requests. How do I do that?

I can't figure out how to run both my gunicorn command and my celery command together.

Also, I'm running other API services in production with gunicorn from their create_app factories, since I don't need them to run the celery command.
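For context, the gunicorn-from-create_app setup described above usually looks like a small entry-point module. This is only a sketch: the package name my_app, the factory name create_app, and the file name wsgi.py are assumptions, adjust them to your layout.

```python
# wsgi.py -- hypothetical gunicorn entry point
# Assumes my_app exposes an application factory named create_app.
from my_app import create_app

# gunicorn imports this module and looks up the `app` callable,
# e.g.: gunicorn -w 4 wsgi:app
app = create_app()
```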


1 Answer


I recommend using Supervisor, which lets you control a number of processes.

Step 1: `pip install supervisor`

Step 2: create a supervisord.conf:

[program:flask_wsgi]
command=gunicorn -w 3 --worker-class gevent wsgi:app 
directory=$SRC_PATH
autostart=true

[program:celery]
command=celery worker -A app.celery --loglevel=info
directory=$SRC_PATH
autostart=true

Step 3: run `supervisord -c supervisord.conf`
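Once supervisord is running, both processes can be managed with supervisorctl. A sketch (the program names flask_wsgi and celery match the config above; these commands talk to the running daemon, so they only work once Step 3 has been done):

```shell
# Show the state of both managed processes
supervisorctl -c supervisord.conf status

# Restart just the celery worker, leaving gunicorn untouched
supervisorctl -c supervisord.conf restart celery
```

This also replaces the `pkill -9` approach from the question: supervisord stops and restarts the worker process for you.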