I'm trying to run Celery beat tasks, but only one of them executes. Here is my project's structure:
trading_platform:
- trading_platform:
  - celery.py
  - settings.py
- offers:
  - tasks.py
- manage.py
celery.py:
import os
from celery import Celery
from django.conf import settings

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'trading_platform.settings')

app = Celery('trading_platform')
app.config_from_object('django.conf:settings', namespace='CELERY')
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)


@app.on_after_configure.connect
def setup_periodic_tasks(sender, **kwargs):
    sender.add_periodic_task(10.0, debug_task.s('HELLO'), name='add every 10')


@app.task
def debug_task(self):
    print(self)
settings.py:
# Celery Configuration Options
CELERY_BROKER_URL = os.environ.get('CELERY_BROKER_URL', 'redis://redis:6379/0')
CELERY_RESULT_BACKEND = os.environ.get('CELERY_RESULT_BACKEND', 'redis://redis:6379/0')
# CELERY_ACCEPT_CONTENT = os.environ.get('CELERY_ACCEPT_CONTENT', 'application/json')
CELERY_RESULT_SERIALIZER = os.environ.get('CELERY_RESULT_SERIALIZER', 'json')
CELERY_TASK_SERIALIZER = os.environ.get('CELERY_TASK_SERIALIZER', 'json')
CELERY_STORE_ERRORS_EVEN_IF_IGNORED = os.environ.get('CELERY_STORE_ERRORS_EVEN_IF_IGNORED', True)
tasks.py:
from trading_platform.celery import app


@app.on_after_configure.connect
def setup_periodic_tasks(sender, **kwargs):
    sender.add_periodic_task(10.0, print_text.s('text'), name='add every 10 sec')


@app.task(bind=True)
def print_text(text):
    print(text)
The command I use to run the worker and beat: celery -A trading_platform worker -B -l info
And here is the output:
celery_1 | /usr/local/lib/python3.8/site-packages/celery/platforms.py:797: RuntimeWarning: You're running the worker with superuser privileges: this is
celery_1 | absolutely not recommended!
celery_1 |
celery_1 | Please specify a different user using the --uid option.
celery_1 |
celery_1 | User information: uid=0 euid=0 gid=0 egid=0
celery_1 |
celery_1 | warnings.warn(RuntimeWarning(ROOT_DISCOURAGED.format(
celery_1 |
celery_1 | -------------- celery@953a7a853036 v5.0.1 (singularity)
celery_1 | --- ***** -----
celery_1 | -- ******* ---- Linux-5.4.0-52-generic-x86_64-with-glibc2.2.5 2020-10-25 09:23:38
celery_1 | - *** --- * ---
celery_1 | - ** ---------- [config]
celery_1 | - ** ---------- .> app: trading_platform:0x7f2b34bc2520
celery_1 | - ** ---------- .> transport: redis://redis:6379/0
celery_1 | - ** ---------- .> results: redis://redis:6379/0
celery_1 | - *** --- * --- .> concurrency: 12 (prefork)
celery_1 | -- ******* ---- .> task events: OFF (enable -E to monitor tasks in this worker)
celery_1 | --- ***** -----
celery_1 | -------------- [queues]
celery_1 | .> celery exchange=celery(direct) key=celery
celery_1 |
celery_1 |
celery_1 | [tasks]
celery_1 | . offers.tasks.print_text
celery_1 | . trading_platform.celery.debug_task
celery_1 |
celery_1 | [2020-10-25 09:23:39,191: INFO/MainProcess] Connected to redis://redis:6379/0
celery_1 | [2020-10-25 09:23:39,197: INFO/MainProcess] mingle: searching for neighbors
celery_1 | [2020-10-25 09:24:20,632: INFO/Beat] Scheduler: Sending due task add every 10 (trading_platform.celery.debug_task)
celery_1 | [2020-10-25 09:24:20,638: INFO/MainProcess] Received task: trading_platform.celery.debug_task[c384669c-951c-4b67-befd-f4683f048ca2]
celery_1 | [2020-10-25 09:24:20,641: WARNING/ForkPoolWorker-8] HELLO
celery_1 | [2020-10-25 09:24:20,643: INFO/ForkPoolWorker-8] Task trading_platform.celery.debug_task[c384669c-951c-4b67-befd-f4683f048ca2] succeeded in 0.002888333000100829s: None
As you can see, only one task is executed (the one from trading_platform/celery.py); the periodic task defined in offers/tasks.py is never scheduled at all. How can I fix this?
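For what it's worth, my understanding is that the same two entries could also be declared statically in settings.py via CELERY_BEAT_SCHEDULE (rough, untested sketch below, reusing the task paths from the [tasks] list in the output), but I'd prefer to understand why the signal-based registration in offers/tasks.py is ignored:

# Untested sketch: static beat schedule in settings.py, picked up via
# config_from_object(..., namespace='CELERY') as beat_schedule.
CELERY_BEAT_SCHEDULE = {
    'add every 10': {
        'task': 'trading_platform.celery.debug_task',
        'schedule': 10.0,  # run every 10 seconds
        'args': ('HELLO',),
    },
    'add every 10 sec': {
        'task': 'offers.tasks.print_text',
        'schedule': 10.0,
        'args': ('text',),
    },
}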