
I have Airflow deployed on a Kubernetes cluster in Azure via Terraform.

The DAGs run successfully and their logs are written to the Azure Storage container.

However, when I try to access the logs for a task in the Airflow UI, it is unable to fetch them from that same container. This is the error it gives:

*** Log file does not exist: /usr/local/airflow/logs/<dag_name>/<task_name>/2020-11-10T23:19:17.444280+00:00/1.log
*** Fetching from: http://xxxxxx:8793/log/<dag_name>/<task_name>/2020-11-10T23:19:17.444280+00:00/1.log
*** Failed to fetch log file from worker. HTTPConnectionPool(host='xxxxxx', port=8793): Max retries exceeded with url: /log/<dag_name>/<task_name>/2020-11-10T23:19:17.444280+00:00/1.log (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7fac1bd99310>: Failed to establish a new connection: [Errno -3] Temporary failure in name resolution'))

I have verified that all the credentials and names are correct, which makes this strange: Airflow is able to write to that location but cannot read from it. From the error it looks like the webserver does not find a local copy of the log file, falls back to fetching it from the worker over HTTP, and that request fails because the worker's hostname cannot be resolved.

Any leads on how to resolve this will be highly appreciated. Thanks in advance!
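For context, remote logging is set up roughly like this in airflow.cfg (a trimmed sketch; the connection id and container path below are placeholders for my real values, and this lives under [logging] on Airflow 2.x but under [core] on 1.10.x):

[logging]
remote_logging = True
remote_log_conn_id = wasb_default
remote_base_log_folder = wasb://<container-name>/airflow-logs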


1 Answer


This was resolved by performing a rolling restart of the deployments on the Kubernetes cluster.
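In case it helps others, the restart can be triggered with kubectl. A sketch, assuming the deployments are named airflow-webserver, airflow-scheduler and airflow-worker in an airflow namespace (your Terraform/Helm setup will likely use different names):

# restart each Airflow deployment so the pods are recreated
kubectl -n airflow rollout restart deployment airflow-webserver
kubectl -n airflow rollout restart deployment airflow-scheduler
kubectl -n airflow rollout restart deployment airflow-worker

# watch until the new pods are up
kubectl -n airflow rollout status deployment airflow-webserver

After the restart, the UI was able to read the task logs again.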