I'm stuck running Celery 3.1.17 on Windows 7 (and later on a 2013 server), using Redis as the broker and backend.
In my celery.py
file I defined an app with one scheduled task:
from celery import Celery
from datetime import timedelta

app = Celery('myapp',
             backend='redis://localhost',
             broker='redis://localhost',
             include=['tasks'])

app.conf.update(
    CELERYBEAT_SCHEDULE={
        'dumdum': {
            'task': 'tasks.dumdum',
            'schedule': timedelta(seconds=5),
        },
    },
)
The task writes a line to a file:
@app.task
def dumdum():
    # lives in tasks.py, so it is registered as 'tasks.dumdum'
    with open('c:/src/dumdum.txt', 'w') as f:
        f.write('dumdum actually ran !')
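(As an aside, once a worker is running the task can be queued by hand from a Python shell to rule out problems with the task itself; a sketch, assuming tasks.py is importable from the working directory:)

from tasks import dumdum
dumdum.delay()  # publishes a task message; a running worker should then write the file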
Running the beat service from the command line
(venv) celery beat -A tasks
celery beat v3.1.17 (Cipater) is starting.
__ - ... __ - _
Configuration ->
. broker -> redis://localhost:6379/1
. loader -> celery.loaders.app.AppLoader
. scheduler -> celery.beat.PersistentScheduler
. db -> celerybeat-schedule
. logfile -> [stderr]@%INFO
. maxinterval -> now (0s)
[2015-03-15 10:50:33,265: INFO/MainProcess] beat: Starting...
[2015-03-15 10:50:35,496: INFO/MainProcess] Scheduler: Sending due task dumdum (tasks.dumdum)
[2015-03-15 10:50:40,513: INFO/MainProcess] Scheduler: Sending due task dumdum (tasks.dumdum)
Looks promising, BUT NOTHING HAPPENS. Nothing is being written to the file.
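To check whether the task messages are at least reaching Redis, one option is to inspect the default 'celery' queue directly (a sketch using redis-py; the db number here is an assumption taken from the broker URL in the beat banner above):

import redis

# Celery's Redis transport pushes task messages onto a list named 'celery' by default.
r = redis.StrictRedis(host='localhost', port=6379, db=1)
print(r.llen('celery'))  # keeps growing with each "Sending due task" line if nothing consumes them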
The Celery documentation on running beat on Windows references this article from 2011. The article explains how to run celeryd
as a scheduled task on Windows, but celeryd
has since been deprecated, and the command given in the article no longer works (there is no celery.bin.celeryd
module).
So, what is the solution here?
Thanks.
You also need to run celery worker -A tasks -l info
which starts a worker instance and starts consuming the tasks you have just queued. - Pandikunta Anand Reddy
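In other words, beat only publishes the task messages on schedule; a separate worker process has to consume and execute them. A sketch of two ways to run this on Celery 3.1 (the -B flag embeds the beat scheduler inside the worker process and is intended for development use):

celery worker -A tasks -l info -B

or, as two separate processes:

celery beat -A tasks
celery worker -A tasks -l info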