
I currently have the Airflow scheduler set up on Linux server A and the Airflow webserver on Linux server B. Neither server has Internet access. I have run initdb on server A and keep all the DAGs on server A.

However, when I refresh the webserver UI, it keeps showing this error message:

This DAG isn't available in the webserver DagBag object

How do I configure the DAG folder for the webserver (server B) so it can read the DAGs from the scheduler (server A)?

I am using the BashOperator. Is the CeleryExecutor a must?

Thanks in advance

My setup is slightly different: the scheduler and the webserver are on different servers. - i2cute
How exactly does that make the product behaviour different? - sulabh chaturvedi
The product behaviour should be the same, as I thought. But the webserver needs to read the DAGs, which are stored on a different server, and I am having problems with this. How can I share the DAGs without copying them over? - i2cute
Interesting. I ran into this when working in virtual environments without having all packages installed globally, while thinking I had them all working virtually - suggesting a setup issue. - Glenn Sampson

1 Answer


The scheduler has found your dags_folder, parsed the DAG files in it, and is scheduling them accordingly. The webserver, however, can only "see" those DAGs through their entries in the metadata database; it cannot find the actual files in its own dags_folder path.
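Both processes read the dags_folder setting from the [core] section of airflow.cfg. The path below is only an illustrative example, not a required default; whatever path you use must contain the DAG files locally on both server A and server B:

```
[core]
# Must point at a directory that actually holds the DAG files
# on *this* machine - set on both server A and server B.
dags_folder = /opt/airflow/dags
```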

You need to ensure that the dags_folder on both servers contains the same files and that the two are kept in sync with one another. This is out of scope for Airflow; it won't handle it on your behalf.