I have 10,000 images added to my Google Cloud Storage bucket every day.
The bucket has a Cloud Functions trigger that fires on each upload and runs a BigQuery scan.
Each invocation checks BigQuery for existing records, so the function queries BigQuery 10,000 times a day, which is driving my BigQuery bill up to an unsustainable amount.
Is there a way to query the database once, store the results in a variable, and make that variable available to all triggered Cloud Function invocations?
In summary: query the DB once, store the query results, and reuse them across all Cloud Function invocations, so I don't hit BigQuery 10,000+ times a day.
P.S. BigQuery processed 288 terabytes of data, which makes for an insane bill $$$$$$$$$$$$