
I have a Google Cloud Tasks queue that processes thousands of HTTP requests (Cloud Functions). I've set up the task queue with the default settings, except that I updated "Max attempts" = 2.


Each task is dispatched from Python using the `google.cloud.tasks_v2` package.
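For reference, the dispatch code looks roughly like this (a minimal sketch: the project, location, queue, and URL are placeholders, and `build_http_task` is a helper name I'm assuming, not part of the library):

```python
import json

def build_http_task(url: str, payload: dict) -> dict:
    """Build the task body passed to CloudTasksClient.create_task."""
    return {
        "http_request": {
            "http_method": "POST",  # or tasks_v2.HttpMethod.POST with the client enums
            "url": url,
            "headers": {"Content-Type": "application/json"},
            "body": json.dumps(payload).encode(),
        }
    }

# Actual dispatch (requires the google-cloud-tasks package; names are placeholders):
# from google.cloud import tasks_v2
# client = tasks_v2.CloudTasksClient()
# parent = client.queue_path("my-project", "us-central1", "my-queue")
# client.create_task(parent=parent, task=build_http_task("https://REGION-PROJECT.cloudfunctions.net/my-fn", {"id": 1}))
```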

The issue I'm facing is that it takes too long to finish processing all the tasks in the queue. With "Max concurrent dispatches" = 1000, I would have expected to see many more tasks running at once, but when refreshing and observing the "running tasks" indicator, the most I've ever seen is 15.


Have I missed something, or are there other settings I can play with to get these tasks processed more quickly?
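For completeness, the two queue settings that govern dispatch speed are `max_concurrent_dispatches` and `max_dispatches_per_second`. A sketch of raising them programmatically (the queue name is a placeholder, and I'm assuming `update_queue` accepts plain dicts with a field mask, rather than this being the only way to call it):

```python
def build_queue_update(queue_name: str, max_concurrent: int, max_per_second: float) -> dict:
    """Queue resource with only the rate-limit fields set."""
    return {
        "name": queue_name,
        "rate_limits": {
            "max_concurrent_dispatches": max_concurrent,
            "max_dispatches_per_second": max_per_second,
        },
    }

# Applying it (requires google-cloud-tasks; the field mask restricts the
# update to rate_limits so other queue settings are left untouched):
# from google.cloud import tasks_v2
# client = tasks_v2.CloudTasksClient()
# client.update_queue(
#     queue=build_queue_update(
#         "projects/my-project/locations/us-central1/queues/my-queue", 1000, 500.0),
#     update_mask={"paths": ["rate_limits"]},
# )
```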

Have you capped the max instances of your cloud function? - MBHA Phoenix
@MBHAPhoenix - thanks for your help. I had a look and the auto scaling setting was set to default with no value. I'm going to test larger values and see if that makes a difference. - Dimo
I tried playing around with these settings but didn't see much difference. You got me thinking about my cloud function, though. Perhaps Cloud Tasks backs off if/when it sees errors. I'm going to look into suppressing any 500 errors and see if that makes a difference. - Dimo

1 Answer


It turns out that the issue had to do with my cloud function. I have try/except blocks that would return a status of 500 whenever an error occurred.

It seems that Cloud Tasks backs off when it sees an increase in error responses. I ended up changing my except blocks to return a 200, and my task queue now finishes substantially faster.
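A minimal sketch of the change, written as a Flask-style handler of the kind Python Cloud Functions use (`process` stands in for the real task logic and is an assumption here):

```python
import logging

def process(payload: dict) -> None:
    """Placeholder for the real task logic; raises on bad input (assumption)."""
    if "id" not in payload:
        raise ValueError("missing id")

def handle_task(payload: dict) -> tuple[str, int]:
    """Always acknowledge with 200 so Cloud Tasks does not throttle the queue."""
    try:
        process(payload)
    except Exception:
        # Log instead of returning 500: a 5xx response makes Cloud Tasks
        # back off its dispatch rate and also consumes a "Max attempts" retry.
        logging.exception("task failed; returning 200 so the queue keeps moving")
        return ("error logged", 200)
    return ("ok", 200)
```

Note the trade-off: a 200 tells Cloud Tasks the task succeeded, so failed tasks are never retried and the error survives only in the logs.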

Hope this helps someone else in the future.