3 votes

Is it possible to configure the cloud function trigger-bucket parameter to be a folder in a GCS bucket?

For example, imagine we have the following:

gs://a_bucket/a_folder

Instead of setting --trigger-bucket gs://a_bucket when deploying, I need to set it at the folder level, i.e. --trigger-bucket gs://a_bucket/a_folder/.

However, I get the error:

ERROR: (gcloud.beta.functions.deploy) argument --trigger-bucket: Invalid value 'gs://a_bucket/a_folder/': Bucket must only contain lower case Latin letters, digits and characters . _ -. It must start and end with a letter or digit and be from 3 to 232 characters long. You may optionally prepend the bucket name with gs:// and append / at the end.

https://cloud.google.com/sdk/gcloud/reference/beta/functions/deploy


1 Answer

4 votes

It is not possible to set the trigger at the folder level.

However, one workaround I found is to filter programmatically inside the Cloud Function by inspecting the name attribute of the change notification, which contains the folder path and the filename.

E.g. for a file a_sample.txt located in gs://a_bucket/a_folder/, the name attribute will contain the string value a_folder/a_sample.txt.

So, you can then filter on the folder you are interested in. The trade-offs:

  1. It's not pretty!
  2. Your cloud function will be triggered for all bucket events - even the ones you are not interested in.

If you can live with that, then it's the way to go (until Google supports triggering at the folder level).
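The filtering described above could look something like this — a minimal Python sketch, assuming a background Cloud Function and reusing the a_folder name from the question (the function name and print statement are illustrative, not prescribed by the platform):

```python
WATCH_FOLDER = "a_folder/"  # the folder from the question; adjust as needed


def in_watched_folder(object_name):
    """Return True if the object path falls under the watched folder."""
    return object_name.startswith(WATCH_FOLDER)


def handle_gcs_event(data, context):
    """Background Cloud Function entry point for a GCS trigger.

    `data` is the object change notification; its `name` attribute holds
    the object path relative to the bucket, e.g. 'a_folder/a_sample.txt'.
    """
    if not in_watched_folder(data["name"]):
        # Event for an object outside the folder; the function was still
        # invoked (that's the trade-off above), but we ignore it here.
        return
    print(f"Processing gs://{data['bucket']}/{data['name']}")
```

The early return keeps the unwanted invocations cheap, but you still pay for each one, which is the second trade-off listed above.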