
I’m building a project using Flask, Celery and Docker. The idea is to run time-consuming processes from REST calls using Celery, and most of them involve calls to external APIs.

First of all, the problem I have is that when I start the containers, tasks don’t run at all and I don’t see anything in the logs except for:

[INFO/MainProcess] Connected to redis://redis:6379/0
[INFO/MainProcess] mingle: searching for neighbors
[INFO/MainProcess] mingle: all alone
[INFO/MainProcess] celery@34569b50965e ready.

I’m using a Docker container for the Flask application, another for the Celery worker, and another for Redis as the broker (which is used by Celery and by Flask-SocketIO).

  • the Flask app container contains the Celery definition and instance
  • the Celery container uses the same flaskapp image but, after activating the virtualenv, runs: celery worker -A app.controller.celery -l info

Then I follow the Celery container’s log (docker logs server_celery_1 -f) in order to monitor whether the tasks are running.

Then I open Postman and make a request to the REST service in the Flask app, which delegates the task to Celery… but nothing happens.

Here is the code involved:

from celery import Celery


def make_celery(_app):
    # Broker and result backend both point at the redis container (DB 0 and DB 1).
    celery = Celery(
        _app.import_name,
        backend='redis://redis:6379/1',
        broker='redis://redis:6379/0'
    )
    celery.conf.update(_app.config)

    class ContextTask(celery.Task):
        def __call__(self, *args, **kwargs):
            # Run every task inside the Flask application context.
            with _app.app_context():
                return self.run(*args, **kwargs)

    celery.Task = ContextTask
    return celery


celery = make_celery(app)
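
(For context, the broker and backend URLs could equally be read from the Flask config instead of being hard-coded; the CELERY_BROKER_URL / CELERY_RESULT_BACKEND key names below are just an assumption for illustration, not necessarily what my config uses.)

app.config['CELERY_BROKER_URL'] = 'redis://redis:6379/0'      # hypothetical key name
app.config['CELERY_RESULT_BACKEND'] = 'redis://redis:6379/1'  # hypothetical key name

# make_celery could then pass these instead of the literal URLs:
#     broker=_app.config['CELERY_BROKER_URL'],
#     backend=_app.config['CELERY_RESULT_BACKEND']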



@app.route('/module/run', methods=['POST'])
@jwt_required
def run_module():
    req = request.get_json()
    module = req.get('module')
    json_input = req.get('input')
    logger.info('Running module: ' + req.get('module'))
    res = do.delay(module, json_input)
    return JSONEncoder().encode({'status': 'Task: ' + str(res) + ' submitted.'})


@celery.task()
def do(module_name, json_input):
    logger.info('____ Running ____')
    modules.run(module_name, json_input)
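
To check whether a task at least reaches the broker, I can poll its state from the id returned by delay() (a minimal sketch; the /module/status/<task_id> route is hypothetical and only added here for illustration):

from celery.result import AsyncResult

@app.route('/module/status/<task_id>', methods=['GET'])
def task_status(task_id):
    # Look up the task in the Redis result backend.
    result = AsyncResult(task_id, app=celery)
    # Tasks that no worker has picked up stay PENDING; once executed they
    # move to SUCCESS (or FAILURE).
    return JSONEncoder().encode({'id': task_id, 'state': result.state})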

**But** if I open the celery events command-line monitor (which looks like it’s not relevant when using Redis… or is it?):

celery -A app.controller.engine.celery events

and run some tasks (nothing happens), then exit the celery events window by pressing CTRL+C twice, the logs suddenly start appearing in the Celery container’s log feed and the tasks start to run.

What am I missing?

Thanks a lot!

Are you sure the broker URL is correct? I would have expected something like redis://redis:6379/0 – Bjorn Stiel
It uses 0 by default if not specified, as far as I know. But I added it now: I set 0 as the broker, 1 for the backend and 2 for Socket.IO. Anyway, same thing: all tasks are pending and don’t run unless something happens (like running and exiting the events GUI)… I still cannot understand why. – magnoz
I’ve realized that sometimes it works right after I start the containers and sometimes it doesn’t. No errors logged. Very weird. – magnoz
I also see that, right after starting the containers, submitted tasks don’t start until I enter the Celery container and run any celery command, e.g. celery -A app.controller.engine.celery inspect stats; then the thing "unblocks" and starts working… any ideas? – magnoz

1 Answer


I kind of fixed this rare issue by starting the Celery worker with the eventlet pool:

celery worker -A app.controller.engine.celery -l info --concurrency=2 --pool eventlet
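
Note that eventlet is a separate package, so it has to be installed in the worker image before the --pool eventlet flag will work (assuming it isn’t already in the requirements):

pip install eventlet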

I don’t fully understand why it doesn’t work with the default (prefork) pool, though.

I’d appreciate it if anyone can shed some light on this.

Thanks anyway!