
When running the following operator, a folder called my_folder is created on the Airflow worker. This makes sense, since it is the worker that executes the task.

from airflow.operators.bash import BashOperator

run_this = BashOperator(
    task_id='run_this',
    bash_command='mkdir my_folder',
)

However, I'd like to create this folder both in the webserver and scheduler. Is there an easy way to do this?

My idea is to update other dags from a dag by copying them from s3, but being able to do this is a first step.

1 Answer


One thing that comes to mind is to mount a shared volume that all components (worker, scheduler, and webserver) can access, and update your DAG to create the folder in that shared volume, like so:

from airflow.operators.bash import BashOperator

run_this = BashOperator(
    task_id='run_this',
    bash_command='mkdir /shared_volume/my_folder',
)

How you do this depends entirely on how you're deploying Airflow.
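For instance, in a Docker Compose deployment it could look roughly like the following sketch. The service names, image tag, and host path are illustrative assumptions, not part of the original question; the point is only that every component mounts the same host directory at the same path:

```yaml
# Illustrative sketch: each Airflow component mounts the same host
# directory (./shared) at /shared_volume inside its container, so a
# folder created by one component is visible to all of them.
version: "3.8"

x-shared-volume: &shared-volume
  - ./shared:/shared_volume

services:
  airflow-webserver:
    image: apache/airflow:2.7.1   # assumed image tag
    volumes: *shared-volume
  airflow-scheduler:
    image: apache/airflow:2.7.1
    volumes: *shared-volume
  airflow-worker:
    image: apache/airflow:2.7.1
    volumes: *shared-volume
```

On Kubernetes the equivalent would be a PersistentVolumeClaim with ReadWriteMany access mounted into each component's pod.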