I'm trying to run beat tasks, but only one of them executes. Here is my project's structure:

trading_platform:

  • trading_platform:
    • celery.py
    • settings.py
  • offers:
    • tasks.py
  • manage.py

celery.py:

import os
from celery import Celery
from django.conf import settings

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'trading_platform.settings')
app = Celery('trading_platform')
app.config_from_object('django.conf:settings', namespace='CELERY')
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)


@app.on_after_configure.connect
def setup_periodic_tasks(sender, **kwargs):
    sender.add_periodic_task(10.0, debug_task.s('HELLO'), name='add every 10')


@app.task
def debug_task(self):
    print(self)

settings.py:

# Celery Configuration Options
CELERY_BROKER_URL = os.environ.get('CELERY_BROKER_URL', 'redis://redis:6379/0')
CELERY_RESULT_BACKEND = os.environ.get('CELERY_RESULT_BACKEND', 'redis://redis:6379/0')
# CELERY_ACCEPT_CONTENT = os.environ.get('CELERY_ACCEPT_CONTENT', 'application/json')
CELERY_RESULT_SERIALIZER = os.environ.get('CELERY_RESULT_SERIALIZER', 'json')
CELERY_TASK_SERIALIZER = os.environ.get('CELERY_TASK_SERIALIZER', 'json')
CELERY_STORE_ERRORS_EVEN_IF_IGNORED = os.environ.get('CELERY_STORE_ERRORS_EVEN_IF_IGNORED', True)

tasks.py:

from trading_platform.celery import app


@app.on_after_configure.connect
def setup_periodic_tasks(sender, **kwargs):
    sender.add_periodic_task(10.0, print_text.s('text'), name='add every 10 sec')


@app.task(bind=True)
def print_text(text):
    print(text)

The command I run: celery -A trading_platform worker -B -l info

And here is the output:

celery_1  | /usr/local/lib/python3.8/site-packages/celery/platforms.py:797: RuntimeWarning: You're running the worker with superuser privileges: this is
celery_1  | absolutely not recommended!
celery_1  | 
celery_1  | Please specify a different user using the --uid option.
celery_1  | 
celery_1  | User information: uid=0 euid=0 gid=0 egid=0
celery_1  | 
celery_1  |   warnings.warn(RuntimeWarning(ROOT_DISCOURAGED.format(
celery_1  |  
celery_1  |  -------------- celery@953a7a853036 v5.0.1 (singularity)
celery_1  | --- ***** ----- 
celery_1  | -- ******* ---- Linux-5.4.0-52-generic-x86_64-with-glibc2.2.5 2020-10-25 09:23:38
celery_1  | - *** --- * --- 
celery_1  | - ** ---------- [config]
celery_1  | - ** ---------- .> app:         trading_platform:0x7f2b34bc2520
celery_1  | - ** ---------- .> transport:   redis://redis:6379/0
celery_1  | - ** ---------- .> results:     redis://redis:6379/0
celery_1  | - *** --- * --- .> concurrency: 12 (prefork)
celery_1  | -- ******* ---- .> task events: OFF (enable -E to monitor tasks in this worker)
celery_1  | --- ***** ----- 
celery_1  |  -------------- [queues]
celery_1  |                 .> celery           exchange=celery(direct) key=celery
celery_1  |                 
celery_1  | 
celery_1  | [tasks]
celery_1  |   . offers.tasks.print_text
celery_1  |   . trading_platform.celery.debug_task
celery_1  | 
celery_1  | [2020-10-25 09:23:39,191: INFO/MainProcess] Connected to redis://redis:6379/0
celery_1  | [2020-10-25 09:23:39,197: INFO/MainProcess] mingle: searching for neighbors

celery_1  | [2020-10-25 09:24:20,632: INFO/Beat] Scheduler: Sending due task add every 10 (trading_platform.celery.debug_task)
celery_1  | [2020-10-25 09:24:20,638: INFO/MainProcess] Received task: trading_platform.celery.debug_task[c384669c-951c-4b67-befd-f4683f048ca2]  
celery_1  | [2020-10-25 09:24:20,641: WARNING/ForkPoolWorker-8] HELLO
celery_1  | [2020-10-25 09:24:20,643: INFO/ForkPoolWorker-8] Task trading_platform.celery.debug_task[c384669c-951c-4b67-befd-f4683f048ca2] succeeded in 0.002888333000100829s: None

As you can see, only one task is executed (the one from trading_platform/celery.py). How can I fix this?

1 Answer


In addition to the normal Celery worker, you also need a celery beat process, and it's recommended not to use the -B option for this. Instead, start the worker(s) and the beat scheduler as separate processes.

Additionally, for Django it's recommended to use django-celery-beat and run beat with its database scheduler, django_celery_beat.schedulers:DatabaseScheduler.

Add django_celery_beat to your Django project's INSTALLED_APPS, if you haven't already:

# settings.py
INSTALLED_APPS = (
    ...,
    'django_celery_beat',
)
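
django-celery-beat stores its schedule in the database, so run the migrations once to create its tables:

$ python manage.py migrate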

Start the beat scheduler:

$ celery -A proj beat -l INFO --scheduler django_celery_beat.schedulers:DatabaseScheduler

Start the worker:

$ celery -A proj worker -l INFO
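
Alternatively, instead of passing --scheduler on the command line every time, you can set it in settings.py; because your app loads settings with config_from_object('django.conf:settings', namespace='CELERY'), the CELERY_-prefixed name below maps to Celery's beat_scheduler setting:

# settings.py
CELERY_BEAT_SCHEDULER = 'django_celery_beat.schedulers:DatabaseScheduler'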

Reference: https://docs.celeryproject.org/en/stable/userguide/periodic-tasks.html#starting-the-scheduler

Additionally, because you declare the task with bind=True, you need to modify its signature so it accepts the bound task instance as the first argument.

@app.task(bind=True)
def print_text(self, text):
    print(text)

Here, self is the task object.
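
With the database scheduler, the schedule itself lives in the database, so you can create the entry for print_text through the Django admin or the ORM. Below is a minimal sketch (run once, e.g. from a data migration or the Django shell); the task path offers.tasks.print_text comes from the [tasks] list in your worker log, while the name is just a label I picked for illustration:

import json

from django_celery_beat.models import IntervalSchedule, PeriodicTask

# An interval that fires every 10 seconds; get_or_create keeps this idempotent.
schedule, _ = IntervalSchedule.objects.get_or_create(
    every=10,
    period=IntervalSchedule.SECONDS,
)

# Register print_text to run on that interval.
PeriodicTask.objects.get_or_create(
    name='print text every 10 sec',          # human-readable label (unique)
    defaults={
        'task': 'offers.tasks.print_text',   # dotted path as shown in the worker's [tasks] list
        'interval': schedule,
        'args': json.dumps(['text']),        # positional arguments, JSON-encoded
    },
)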