Latest Apache-Airflow install from PyPI (1.9.0)
Setup includes:
- Apache-Airflow
- Apache-Airflow[celery]
- RabbitMQ 3.7.5
- Celery 4.1.1
- Postgres
I have the installation across 3 hosts; the relevant airflow.cfg settings are sketched after the host breakdown below.
Host #1
- Airflow Webserver
- Airflow Scheduler
- RabbitMQ Server
- Postgres Server
Host #2
- Airflow Worker
Host #3
- Airflow Worker
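On all three hosts the executor and broker settings in airflow.cfg look roughly like the sketch below; the hostnames, credentials, vhost, and database names are placeholders rather than my real values:

```ini
[core]
executor = CeleryExecutor
# Airflow metadata database on Host #1 (placeholder credentials)
sql_alchemy_conn = postgresql+psycopg2://airflow_user:airflow_pass@host1/airflow_db

[celery]
# RabbitMQ broker on Host #1, custom user and vhost (placeholders)
broker_url = amqp://rabbit_user:rabbit_pass@host1:5672/airflow_vhost
# Celery results stored in the same Postgres instance
celery_result_backend = db+postgresql://airflow_user:airflow_pass@host1/airflow_db
default_queue = default
```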
I have a simple DAG with a single BashOperator task that runs every 1 minute. I can see the scheduler "queue" the job; however, it never gets added to a Celery/RabbitMQ queue and never gets picked up by the workers. I use a custom RabbitMQ user, and authentication seems fine. Flower doesn't show any of the queues populating with data, though it does see the two worker machines listening on their respective queues.
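For completeness, the DAG is essentially the minimal sketch below; the dag_id, task_id, and bash command are placeholders, but the structure (one BashOperator, 1-minute schedule, static start_date in the past) matches what I'm running:

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.bash_operator import BashOperator

default_args = {
    'owner': 'airflow',
    'start_date': datetime(2018, 5, 1),  # static date in the past
    'retries': 0,
}

dag = DAG(
    dag_id='simple_bash_dag',                # placeholder name
    default_args=default_args,
    schedule_interval=timedelta(minutes=1),  # run every minute
    catchup=False,
)

task = BashOperator(
    task_id='echo_task',                     # placeholder name
    bash_command='echo "hello from $(hostname)"',
    dag=dag,
)
```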
Things I've checked:
- Airflow pool configuration
- Airflow environment variables
- Upgrading/downgrading Celery and RabbitMQ
- Postgres permissions
- RabbitMQ permissions
- DEBUG-level Airflow logs
I have read the documentation section about jobs not running. My "start_date" is a static date well before the current date.
OS: CentOS 7