Is there any way to run a Google Cloud Data Fusion pipeline from a Cloud Function (preferably Python-based)?
The core requirement is that an event-based Cloud Function executes whenever a new file arrives in a GCS bucket. The Cloud Function, in turn, needs to start a Data Fusion pipeline that loads that GCS file into BigQuery (a rough sketch of what I have in mind is included after the deploy command below).
To deploy the Cloud Function so that it is triggered whenever a new object is finalized in the bucket, we can use the following:
gcloud functions deploy hello_gcs_generic \
  --runtime python37 \
  --trigger-resource YOUR_TRIGGER_BUCKET_NAME \
  --trigger-event google.storage.object.finalize
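What I am unsure about is how to start the Data Fusion pipeline from inside the function. My rough, untested idea is to call the instance's CDAP REST API with an OAuth token obtained from the function's service account. In the sketch below, `CDAP_ENDPOINT`, `NAMESPACE`, `PIPELINE_NAME`, and the runtime argument `gcs.file.name` are all placeholders for my own instance and pipeline, not real values:

```python
# main.py -- rough sketch of the Cloud Function body (untested).
# Requires `requests` and `google-auth` in requirements.txt.
import requests
import google.auth
import google.auth.transport.requests

# Placeholders: the Data Fusion instance's API endpoint, namespace and
# deployed pipeline name would come from my own environment.
CDAP_ENDPOINT = "https://<instance>-<project>-dot-<region>.datafusion.googleusercontent.com/api"
NAMESPACE = "default"
PIPELINE_NAME = "gcs_to_bq_pipeline"


def hello_gcs_generic(data, context):
    """Triggered by google.storage.object.finalize; starts the pipeline."""
    # Obtain an access token for the function's service account.
    credentials, _ = google.auth.default()
    credentials.refresh(google.auth.transport.requests.Request())
    headers = {"Authorization": "Bearer " + credentials.token}

    # Start the deployed batch pipeline via the CDAP REST API, passing the
    # new object's name as a runtime argument (argument name is hypothetical
    # and would have to match a macro defined in the pipeline).
    start_url = (f"{CDAP_ENDPOINT}/v3/namespaces/{NAMESPACE}/apps/"
                 f"{PIPELINE_NAME}/workflows/DataPipeline/start")
    runtime_args = {"gcs.file.name": data["name"]}
    response = requests.post(start_url, headers=headers, json=runtime_args)
    response.raise_for_status()
```

Is this the right approach, or is there a more direct / supported way (e.g. a Python client library) to trigger a Data Fusion pipeline from a Cloud Function?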