0 votes

How to structure my Python REST API (FastAPI) project?

Different API endpoints submit tasks to different Celery workers. I want each Celery worker to be built as a separate image, with all builds managed by docker-compose.

I tried separating the API directory from the Celery worker directories and putting a Dockerfile in each, but I ran into a problem where a task submitted by the API was rejected by the worker as an unregistered task. Maybe there is a way to fix that, but it would feel like a workaround to me.

Update

my_app/
    docker-compose.yml
    fastapi_app/
        api/
            ...
        app.py
        Dockerfile
    worker_app1/
        core_app_code/
            ...
        Dockerfile
    worker_app2/
        core_app_code/
            ...
        Dockerfile

The main question is: where should the tasks be defined for each worker, so that fastapi_app can submit them?
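
One way this can work (a minimal sketch, assuming a Redis broker; the task name and queue are hypothetical): fastapi_app submits tasks by registered name with send_task, so it never has to import the workers' code.

# fastapi_app/app.py -- sketch only; broker URL, task name, and queue are assumptions
from celery import Celery
from fastapi import FastAPI

celery_app = Celery("api", broker="redis://redis:6379/0")
app = FastAPI()

@app.post("/jobs")
def submit_job():
    # send_task only needs the task's registered name and target queue,
    # not the task function itself
    result = celery_app.send_task(
        "core_app_code.tasks.process",  # hypothetical name registered by worker_app1
        args=["some payload"],
        queue="worker1",
    )
    return {"task_id": result.id}

The worker images then only need their own tasks modules, each started against its own queue.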

What is the motivation for separate workers having their own app code (I suppose tasks)? You typically define one set of tasks and then have multiple workers running them from a shared queue. – im_baby
They do completely different things. But now I am thinking whether I should serve them as microservices instead. – MrFoot fifer
It doesn't matter that the tasks are different; there's just no need to separate them, especially if the same API is going to be sending the tasks to the queue. – im_baby
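
To illustrate the suggestion in the comments: with one shared tasks module, the workers can differ only in which queue they consume (queue names here are assumptions):

celery -A tasks worker -Q worker1 --loglevel=info  # consumes only the worker1 queue
celery -A tasks worker -Q worker2 --loglevel=info  # consumes only the worker2 queue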

1 Answer

1 vote

You don't need two Dockerfiles for the Celery worker and the API; you can write the celery command directly in the docker-compose file.

See the example below for running a Celery worker from a docker-compose file.

version: "3"

services:
  worker:
    build: .  # path to your Celery app
    command: celery -A tasks worker --loglevel=info  # adjust the log level for production
    depends_on:
      - redis  # your message broker

  redis:  # broker service added so depends_on resolves; Redis is assumed
    image: redis:alpine