I have configured Celery to run async jobs for a Flask application on a dev box like this:
config.py:
class CeleryConfig(object):
    CELERY_BROKER_URL = 'redis://localhost:6379/0'
    CELERY_RESULT_BACKEND = 'redis://localhost:6379/0'

CELERY_CONFIG = CeleryConfig
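For context, the Flask app is configured from this module in the usual way, roughly like the following (simplified; the real factory code may differ slightly):

import config
from flask import Flask

app = Flask(__name__)
app.config.from_object(config)  # copies uppercase names, so app.config['CELERY_CONFIG'] is the CeleryConfig class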
manage.py:
import celery
from celery.bin import worker as celery_worker

# `app` is the Flask application and `config` its configuration (set up elsewhere)
celery_app = celery.Celery(config_source=app.config.get('CELERY_CONFIG'))

def run_celery():
    appl = celery.current_app._get_current_object()
    worker = celery_worker.worker(app=appl)
    options = {
        'broker': config.get('CELERY_CONFIG').CELERY_BROKER_URL,
        'traceback': True,
    }
    worker.run(**options)
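run_celery() is invoked from the manage.py entry point, roughly like this (simplified). On Celery 4.x I would expect celery_app.conf.broker_url to already reflect the Redis URL at this point, which is part of what confuses me:

if __name__ == '__main__':
    # Illustrative sanity check: I would expect this to print the Redis URL
    # from CeleryConfig, not the default amqp transport.
    print(celery_app.conf.broker_url)
    run_celery()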
Prior to starting the application I start Redis:
./redis-server --daemonize yes
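To rule out Redis itself, a quick ping from Python (assuming the redis-py client is installed) can confirm whether the server is reachable on the configured port; it should print True if the daemon started cleanly:

import redis

# Illustrative check: the broker URL in config.py should point at a live Redis instance.
r = redis.Redis(host='localhost', port=6379, db=0)
print(r.ping())  # True when the server is up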
Then, when I run the application (run_celery), the Celery startup banner shows:
- ** ---------- .> transport: amqp://guest:**@localhost:5672//
- ** ---------- .> results: redis://localhost:6379/0
and the following recurring error:
ERROR/MainProcess consumer: Cannot connect to amqp://guest:**@127.0.0.1:5672//: [Errno 111] Connection refused.
I am not sure why the transport is using RabbitMQ (amqp) rather than the Redis broker I configured, or why I can't get the Celery worker started.