We have 3 Composer GKE nodes and 3 worker pods distributed across the 3 nodes. I need to know how to check which DAG or task is currently running in which pod. I tried running airflow list_dags, but I think that just lists all DAGs. I only need to know which DAG is running in which pod. Also, is it possible to move a pod from one node to another? Sometimes my pods are not evenly distributed across all 3 nodes.
1 Answer
There are two ways of getting this information from the Airflow UI:
You can go to Browse -> Task Instances and check the hostname column to see which worker each task ran on. You can also apply filters there.
Or you can go to Data Profiling and run an Ad Hoc Query such as the following one:
SELECT dag_id, task_id, state, hostname
FROM task_instance
WHERE state = 'running';
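If you prefer to get this outside the UI, here is a minimal sketch of the same query through Airflow's own ORM session. It assumes Airflow 1.10.x (the version bundled with Composer) and that it runs somewhere with access to the Composer metadata database, for example inside a worker:

# Minimal sketch: list running task instances and the worker hostname
# they are executing on, via Airflow's metadata database session.
# Assumes Airflow 1.10.x and that the metadata DB is reachable from here.
from airflow import settings
from airflow.models import TaskInstance
from airflow.utils.state import State

session = settings.Session()
running = (
    session.query(TaskInstance)
    .filter(TaskInstance.state == State.RUNNING)
    .all()
)

for ti in running:
    # hostname is the worker that picked up the task
    print(ti.dag_id, ti.task_id, ti.state, ti.hostname)

session.close()

On Composer the hostname should correspond to the worker pod's name, so you can cross-reference it with kubectl get pods -o wide against the environment's GKE cluster to see which node each worker pod is scheduled on.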