6
votes

I am a newbie to Airflow. We have a DAG with 3 tasks. Currently we use the Celery Executor because we need the flexibility to run individual tasks. We don't want to schedule the workflow; for now it will be triggered manually. Is there any way to execute the entire workflow from the Airflow UI (same as we have in Oozie)?

Executing one task at a time is a pain.

2
To clarify, do you want all three of the tasks to run when you run the first task? Also, please post your relevant code. – Daniel Lee
If you set the dependencies and then run the DAG from the command line with airflow trigger_dag id, what is the issue? – Daniel Lee
Yes, you got that right. We need to run all the tasks when we run the first one. Sorry, due to restrictions it won't be possible to post the code. We are going to hand the DAG over to a support team whose job will be to manually trigger the workflow. As they don't have much experience with the command line, we need to execute it via the UI. – user1432155
Run airflow scheduler in a separate process (next to the web server). Then you'll be able to manually trigger the DAG. – Khozzy

2 Answers

7
votes

In Airflow 1.8 and higher there is a button for each dag on the dashboard that looks like a play button:

play button

In older versions of Airflow, you can use the dialog found at:

Browse -> Dag Runs -> Create

Either one should kick off a DAG run from the UI.
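If the support team ever does get comfortable with a terminal, the same thing can be done from the command line, as mentioned in the comments. A minimal sketch, assuming the DAG id is `your_dag` (a placeholder) and the scheduler is running so the triggered run actually gets picked up:

```shell
# Start the scheduler alongside the web server (one-time setup, in its own process)
airflow scheduler &

# Manually kick off a run of the DAG -- equivalent to pressing the play button
airflow trigger_dag your_dag
```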

0
votes

I'll take a stab at it and hopefully you can adapt your code to work with this.

# Imports needed for this snippet to run (Airflow 1.x-era API)
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.bash_operator import BashOperator

default_args = {
    'owner': 'airflow',
    'depends_on_past': False,
    'start_date': datetime(2015, 6, 1),
    'email': ['[email protected]'],
    'email_on_failure': False,
    'email_on_retry': False,
    'retries': 1,
    'retry_delay': timedelta(minutes=5),
    # 'queue': 'bash_queue',
    # 'pool': 'backfill',
    # 'priority_weight': 10,
    # 'end_date': datetime(2016, 1, 1),
}

dag = DAG('your_dag', default_args=default_args)

# start of your tasks -- note that each task needs a unique task_id
first_task = BashOperator(task_id='first_task',
                          bash_command='python script1_name args',
                          dag=dag)
second_task = BashOperator(task_id='second_task',
                           bash_command='python script2_name args',
                           dag=dag)
third_task = BashOperator(task_id='third_task',
                          bash_command='python script3_name args',
                          dag=dag)

# then set the dependencies
second_task.set_upstream(first_task)
third_task.set_upstream(second_task)

Then when you trigger the DAG, all three tasks will run in order. If the tasks are not dependent on each other, you can remove the set_upstream() lines from the script. Note that all three tasks must be defined in the same DAG file so that one trigger runs them all.
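To see why the tasks run in that order, here is a tiny plain-Python sketch (no Airflow needed, and not Airflow's actual scheduler) of the idea behind set_upstream(): a task only becomes runnable once everything upstream of it has finished.

```python
# Illustrative only: a stripped-down Task with set_upstream(), plus a
# scheduler loop that runs tasks whose upstream dependencies are done.
class Task:
    def __init__(self, task_id):
        self.task_id = task_id
        self.upstream = []

    def set_upstream(self, other):
        # "other" must finish before this task may run
        self.upstream.append(other)

def run_order(tasks):
    """Return task_ids in an order that respects upstream dependencies."""
    done, order = set(), []
    while len(order) < len(tasks):
        for t in tasks:
            if t.task_id not in done and all(u.task_id in done for u in t.upstream):
                done.add(t.task_id)
                order.append(t.task_id)
    return order

first, second, third = Task("first_task"), Task("second_task"), Task("third_task")
second.set_upstream(first)
third.set_upstream(second)

# Even if the tasks are listed out of order, dependencies dictate execution
print(run_order([third, second, first]))
# -> ['first_task', 'second_task', 'third_task']
```

Remove the set_upstream() calls and the tasks have no ordering constraint between them, which is exactly the independent-tasks case described above.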