Background
- I have created an Airflow webserver using a Cloud Composer environment within Google Cloud Platform (3 nodes, image version composer-1.10.0-airflow-1.10.6, machine type n1-standard-1).
- I have not yet configured any networks for this environment.
- Airflow works fine for simple test DAGs.
The problem
- I wrote a ping_ip DAG for determining whether a physical machine (i.e. my laptop) is connected to the internet. (Code: https://pastebin.com/FSBPNnkP)
- I tested the Python used to ping the machine locally (via response = os.system("ping -c 1 " + ip_address)) and it returned 0, i.e. an active network.
- When I moved this code into an Airflow DAG, the code ran fine, but this time returned 256 for the same IP address.
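A side note on the 0 vs. 256 values (my reading, not from the post itself): os.system() does not return the command's exit code directly, but the raw wait status, in which the exit code occupies the high byte. So 256 corresponds to ping exiting with code 1, which ping uses when packets were sent but no reply came back:

```python
import os

# os.system() returns the raw wait status, not the command's exit code:
# an exit code of 1 (ping's "no reply received") is encoded in the high
# byte, so it surfaces as 1 << 8 == 256.
status = 256  # the value seen in the DAG logs
exit_code = os.WEXITSTATUS(status)
print(exit_code)  # 1: ping ran, but got no replies
```

So the DAG's ping did execute; it simply received no response, which matches the "100% packet loss" line in the logs below.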
Here are the Airflow logs for the triggered DAG:
[2020-04-28 07:59:35,671] {base_task_runner.py:115} INFO - Job 2514: Subtask ping_ip 1 packets transmitted, 0 received, 100% packet loss, time 0ms
[2020-04-28 07:59:35,673] {base_task_runner.py:115} INFO - Job 2514: Subtask ping_ip [2020-04-28 07:59:35,672] {logging_mixin.py:112} INFO - Network Error.
[2020-04-28 07:59:35,674] {base_task_runner.py:115} INFO - Job 2514: Subtask ping_ip [2020-04-28 07:59:35,672] {python_operator.py:114} INFO - Done. Returned value was: ('Network Error.', 256)
- I suspect the environment has networking issues for external IPs.
- Does anybody know how to ping an external IP from within an Airflow Service managed by GCP?
- The end goal is to create a DAG that prompts a physical machine to run a Python script. I thought this process should start with a simple sub-DAG that checks whether the machine is connected to the internet. If I'm going about this the wrong way, please let me know.
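For what it's worth, if outbound ICMP turns out to be blocked somewhere along the path (by GCP, or by the NAT/firewall in front of the laptop), a TCP-level reachability check is a possible substitute for ping. A sketch, where the host and port are placeholders for the machine's address and a port it actually listens on:

```python
import socket

def is_reachable(host, port, timeout=3.0):
    # Attempt a TCP handshake instead of sending ICMP; many networks
    # that silently drop ping still allow TCP to a listening port.
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Example usage (placeholder address/port):
# is_reachable("203.0.113.10", 22)
```

This would slot into the DAG in place of the os.system ping call, with the added benefit of returning a plain True/False rather than a wait status.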
