6
votes

I recently upgraded from airflow 1.9 to 1.10 and performed the following commands:

  • airflow upgradedb
  • changed all of my Celery config option names, as mentioned here
  • export SLUGIFY_USES_TEXT_UNIDECODE=yes
  • added: log_filename_template = {{ ti.dag_id }}/{{ ti.task_id }}/{{ execution_date.strftime("%%Y-%%m-%%dT%%H:%%M:%%S") }}/{{ try_number }}.log to my config
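One detail in that last step that is easy to trip over: the doubled `%%` in the strftime directives is ConfigParser escaping, not part of strftime itself. A minimal sketch (the `[core]` section name matches the default `airflow.cfg` layout; the template value is copied from the step above):

```python
# Sketch: Airflow loads airflow.cfg with Python's ConfigParser, whose
# interpolation treats "%" as special, so a literal "%" must be written
# as "%%". The doubled form collapses back to a single "%" on read.
from configparser import ConfigParser

cfg_text = """
[core]
log_filename_template = {{ ti.dag_id }}/{{ ti.task_id }}/{{ execution_date.strftime("%%Y-%%m-%%dT%%H:%%M:%%S") }}/{{ try_number }}.log
"""

parser = ConfigParser()
parser.read_string(cfg_text)
template = parser.get("core", "log_filename_template")
print(template)
# After interpolation the value contains plain strftime directives
assert '%Y-%m-%dT%H:%M:%S' in template
```

If you write single `%` signs in the file instead, ConfigParser raises an interpolation error at startup.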

Jobs seem to be running fine, but logs don't appear when I click on DAG task nodes.


I opened my browser's network tab; a request to the following URL returns this JSON:

$AIRFLOW_URL/airflow/get_logs_with_metadata?dag_id=xxxx&task_id=xxxxx&execution_date=2018-09-09T23%3A03%3A10.585986%2B00%3A00&try_number=1&metadata=null

{"error":true,"message":["Task log handler file.task does not support read logs.\n'NoneType' object has no attribute 'read'\n"],"metadata":{"end_of_log":true}}

Additionally, there is a 404 on a request for js/form-1.0.0.js. Any advice on extra steps to get logs working again?

I can confirm that logs are showing up in the logs directory for tasks on the airflow server.
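Since the files exist on disk, the problem is likely the path the webserver computes versus the path the worker wrote. A small sketch of the on-disk path the template above produces (the dag/task ids and date here are hypothetical placeholders):

```python
# Sketch: the log path that log_filename_template renders for one task
# instance, so it can be compared against the actual layout under the
# logs directory. All values below are illustrative placeholders.
from datetime import datetime

dag_id = "example_dag"      # placeholder
task_id = "example_task"    # placeholder
execution_date = datetime(2018, 9, 9, 23, 3, 10)
try_number = 1

path = "{}/{}/{}/{}.log".format(
    dag_id,
    task_id,
    execution_date.strftime("%Y-%m-%dT%H:%M:%S"),
    try_number,
)
print(path)
# -> example_dag/example_task/2018-09-09T23:03:10/1.log
```

If the directories under your logs folder don't match this shape, the webserver will look in the wrong place even though the worker wrote logs successfully.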


1 Answer

11
votes

Using https://github.com/apache/incubator-airflow/blob/master/airflow/config_templates/default_airflow.cfg

I previously had

task_log_reader = file.task

and changed it to:

task_log_reader = task

I also added:

log_filename_template = {{ ti.dag_id }}/{{ ti.task_id }}/{{ ts }}/{{ try_number }}.log
log_processor_filename_template = {{ filename }}.log
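A quick way to confirm the settings landed as intended is to read them back the way Airflow does, via ConfigParser. A minimal sketch, assuming the three values go in the `[core]` section as in the default `airflow.cfg`:

```python
# Sketch: write the corrected settings to an in-memory config and read
# them back, confirming ConfigParser parses them without interpolation
# errors and returns the expected reader name.
from configparser import ConfigParser

cfg_text = """
[core]
task_log_reader = task
log_filename_template = {{ ti.dag_id }}/{{ ti.task_id }}/{{ ts }}/{{ try_number }}.log
log_processor_filename_template = {{ filename }}.log
"""

parser = ConfigParser()
parser.read_string(cfg_text)
assert parser.get("core", "task_log_reader") == "task"
print(parser.get("core", "log_filename_template"))
```

Running the same check against your real `airflow.cfg` (with `parser.read(path)`) should show `task`, not `file.task`; the stale `file.task` value is what produces the "does not support read logs" error in the question.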