I deployed a simple Pub/Sub-triggered Cloud Function following this tutorial: https://medium.com/@milosevic81/copy-data-from-pub-sub-to-bigquery-496e003228a1
As a test, I pushed a large (over 8 MB) message to the Pub/Sub topic.
As a result, the Cloud Function logged the following error: Function execution could not start, status: 'request too large'
The issue is that the Cloud Function then fired repeatedly, producing constant resource usage and log messages. It stopped only after I manually purged the related Pub/Sub topic.
Is there a mechanism or configuration to prevent this behavior? Ideally, a Pub/Sub message should not be picked up again after the Cloud Function trigger has executed.
Do you have the "Retry on failure" setting enabled? If that's the case, the function will be retried endlessly when there's no error handling.
– bhito
@bhito I have the "Retry on failure" setting disabled.
– Michał Herman
1 Answer
You reached a Cloud Functions quota:
Max uncompressed HTTP request size -> 10 MB
One solution is to use Cloud Run (its quota is higher: 32 MB).
For this, you need two changes:
- Convert your Cloud Function into a Cloud Run service; a minimal sketch follows this list. I wrote an article (not dedicated to this topic, but it includes a Python example), and I presented this at GDG Ahmedabad last month, in Go that time.
- Create a push subscription on your Pub/Sub topic and set the Cloud Run HTTPS endpoint as its push endpoint (see the gcloud command below).
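Here is a minimal sketch of such a Cloud Run service in Python, assuming a Flask app; the route and the processing step are placeholders:

```python
import base64

from flask import Flask, request

app = Flask(__name__)

# Pub/Sub push subscriptions POST a JSON envelope to the configured endpoint.
@app.route("/", methods=["POST"])
def index():
    envelope = request.get_json()
    if not envelope or "message" not in envelope:
        # A non-2xx response signals failure; Pub/Sub will redeliver the message.
        return "Bad Request: invalid Pub/Sub envelope", 400

    # The payload arrives base64-encoded in the "data" field.
    data = base64.b64decode(envelope["message"].get("data", "")).decode("utf-8")

    # ... process `data` here, e.g. insert rows into BigQuery ...

    # Any 2xx response acknowledges the message, so Pub/Sub stops redelivering it.
    return "", 204


if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)
```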
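To wire up the push subscription, something like this should work (my-sub, my-topic, and the service URL are placeholders):

```sh
gcloud pubsub subscriptions create my-sub \
  --topic my-topic \
  --push-endpoint https://my-service-xyz-uc.a.run.app/
```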
Cloud Run can handle up to 80 concurrent requests per instance, while Cloud Functions handles only one. Because your requests are "big", processing too many of them in the same instance might cause memory issues. You can control this with Cloud Run's --concurrency parameter; set it to 1 to get the same behavior as Cloud Functions.
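For example (my-service and the image path are placeholders):

```sh
gcloud run deploy my-service \
  --image gcr.io/my-project/pubsub-handler \
  --concurrency 1
```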