I am trying to use a packaged DAG with the Celery executor, but neither the scheduler nor the worker picks up the job. I have restarted the Airflow webserver and the Airflow scheduler, but still no success. I have even reset the metadata database with airflow resetdb, but still nothing.
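For reference, these are roughly the commands I ran (Airflow 1.x CLI; each "restart" means killing the old process and starting it again):

airflow resetdb -y      # wipe and re-initialize the metadata database
airflow webserver -D    # restart the webserver in daemon mode
airflow scheduler -D    # restart the scheduler in daemon mode
airflow worker -D       # Celery worker, since I am on the CeleryExecutor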
I am getting the following messages:
[INFO] Handling signal: ttou
[INFO] Worker exiting (pid: 31418)
[INFO] Handling signal: ttin
[INFO] Booting worker with pid: 32308
The DAGs neither run when triggered manually nor get picked up by the scheduler.
My zip file has the following contents:
unzip alerting.zip
creating: airflow_utils/
inflating: airflow_utils/enums.py
inflating: airflow_utils/psql_alerting_dag.py
extracting: airflow_utils/__init__.py
inflating: airflow_utils/hive_alerting_dag.py
inflating: airflow_utils/alerting_utils.py
inflating: alerting_12hrs.py
inflating: alerting_15mins.py
inflating: alerting_3hrs.py
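For reference, the zip is built from the directory containing these files, with the DAG files and the airflow_utils package at the zip root:

zip -r alerting.zip alerting_12hrs.py alerting_15mins.py alerting_3hrs.py airflow_utils/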
If I place all these files directly in the dags folder instead of packaging them, the Airflow scheduler is able to schedule the DAGs.
What am I doing wrong with packaged DAGs?
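For context, each top-level DAG file imports helpers from the bundled package. A simplified sketch of what alerting_15mins.py looks like (run_alerts is a hypothetical helper name standing in for the real functions in airflow_utils/alerting_utils.py):

from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python_operator import PythonOperator

# Resolved from inside the zip: Airflow adds the zip root to sys.path
from airflow_utils import alerting_utils

default_args = {
    "owner": "airflow",
    "start_date": datetime(2018, 1, 1),
    "retries": 1,
    "retry_delay": timedelta(minutes=5),
}

dag = DAG(
    "alerting_15mins",
    default_args=default_args,
    schedule_interval=timedelta(minutes=15),
)

run_alerts = PythonOperator(
    task_id="run_alerts",
    python_callable=alerting_utils.run_alerts,  # hypothetical helper
    dag=dag,
)

These files schedule fine when unpacked into the dags folder, so the imports themselves appear to work.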

Comments:
Is the zip in the /dags directory? - cwurtz
It is in the dags directory, but if the zip (packaged DAG) is kept there, it only runs the first task in the DAG. - DrGeneral