When running the following operator, a folder called my_folder is created on the Airflow worker. This makes sense, since it is the worker that executes the task.
from airflow.operators.bash import BashOperator

run_this = BashOperator(
    task_id='run_this',
    bash_command='mkdir my_folder',
)
However, I'd like to create this folder on the webserver and the scheduler as well. Is there an easy way to do this?
My end goal is to update other DAGs from within a DAG by copying them from S3, and being able to create a folder on all components is a first step.
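For reference, here is a rough sketch of that end goal, so it's clear what I'm aiming for. The bucket name, target path, and schedule are placeholders, and it assumes the AWS CLI is available wherever the task runs (which, again, would only be the worker):

from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

# Hypothetical DAG that pulls DAG files from S3 into the DAGs folder.
# Bucket name and target path are placeholders.
with DAG(
    dag_id='sync_dags_from_s3',
    start_date=datetime(2023, 1, 1),
    schedule_interval=None,
    catchup=False,
) as dag:
    sync_dags = BashOperator(
        task_id='sync_dags',
        bash_command='aws s3 sync s3://my-bucket/dags /opt/airflow/dags',
    )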