I want to check programmatically whether any tasks are running on a specific Celery worker. I don't mind where the solution runs; it can be on the airflow-scheduler/db machine or on the Airflow worker machine itself.
I've checked this: How do I check if there are DAGs running in Airflow (before restarting Airflow)?
However, that only checks for running tasks across all workers. I want to check whether a specific worker has no running tasks, so that I can stop that worker (downscale workers).
I have Flower installed as well, and I can monitor succeeded/failed tasks there, but I'm not sure that helps here.
Queues are not used, but they could be if needed.
Could I monitor the processes to check whether their parent is the airflow worker/celery process, or something along those lines?

Any ideas?
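For reference, one direction I'm considering is Celery's remote-control inspect API, which can query a single worker for its currently active tasks. This is only a sketch: the broker URL and the worker hostname `celery@worker1` below are placeholders, and the commented-out `Celery(...)` setup assumes your broker is reachable from wherever this runs. The helper just interprets the reply shape that `inspect().active()` returns (a dict mapping worker hostname to a list of task dicts, or `None` if no worker replied):

```python
# Sketch: decide whether a specific Celery worker is idle, based on the
# reply shape returned by app.control.inspect(...).active().

def worker_is_idle(active_reply, worker_name):
    """active_reply: dict mapping worker hostname -> list of active task
    dicts, as returned by inspect().active(); None means no reply at all.
    Returns True if idle, False if busy, None if the worker is unreachable."""
    if not active_reply or worker_name not in active_reply:
        return None  # worker did not reply; don't assume it is safe to stop
    return len(active_reply[worker_name]) == 0


if __name__ == "__main__":
    # With a real app this would be something like:
    #   from celery import Celery
    #   app = Celery(broker="amqp://...")  # placeholder broker URL
    #   reply = app.control.inspect(destination=["celery@worker1"]).active()
    # Sample replies in the same shape, for illustration only:
    busy = {"celery@worker1": [{"id": "abc", "name": "airflow.task"}]}
    idle = {"celery@worker1": []}
    print(worker_is_idle(busy, "celery@worker1"))  # False -> still busy
    print(worker_is_idle(idle, "celery@worker1"))  # True -> safe to downscale
```

If this is sound, the downscale script could poll `worker_is_idle` for the target worker and only issue the shutdown once it returns `True`.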