I'm trying to import a shared script into many DAGs so they can all call the same operations. What is the best way to implement this?
Right now my folder structure looks like this:
dags/
|-- some_dags_folder/
|---- some_dag.py
|-- other_dags_folder/
|---- another_dag.py
|-- utils/
|---- util_slack.py
To import the util_slack file, I put the following into the DAG code (for this example, assume the code is in some_dag.py):
from ..utils.util_slack import some_function
After placing everything inside Airflow, I get the following error:
Broken DAG: [/usr/local/airflow/dags/some_dags_folder/some_dag.py] attempted relative import with no known parent package
The util_slack script sends either a success message or a failure message, and it looks like this:
from airflow.contrib.operators.slack_webhook_operator import SlackWebhookOperator
from airflow.hooks.base_hook import BaseHook

CHANNEL = BaseHook.get_connection('Slack').login
TOKEN = BaseHook.get_connection('Slack').password

def slack_success(context):
    ...
    alterHook = SlackWebhookOperator(...)
    return alterHook.execute(context=context)

def slack_fail(context):
    ...
    alterHook = SlackWebhookOperator(...)
    return alterHook.execute(context=context)
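For context, the elided `...` bodies do little more than build a message from the Airflow callback context before constructing the operator. A minimal sketch of that formatting step (the helper name and message wording are my own illustration, not part of util_slack):

```python
def build_slack_message(context, success):
    # 'task_instance' and 'execution_date' are standard keys in an
    # Airflow callback context; the message text itself is made up.
    ti = context['task_instance']
    status = 'succeeded' if success else 'failed'
    return 'Task {0}.{1} {2} at {3}'.format(
        ti.dag_id, ti.task_id, status, context['execution_date'])
```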
The idea is that I can import the util_slack module (or any other homemade module) into multiple DAGs and invoke whichever function I need:
...
from ..utils.util_slack import slack_success
...

def task_success(context):
    return slack_success(context)

...
some_task_in_dag = SSHOperator(
    ...
    on_success_callback=task_success,
    ...)
Is this the best approach, or is it better to create custom plugins like the ones shown at https://airflow.apache.org/plugins.html?