I'm trying to use Cloud Functions to update data by calling an external API once a day.
So far I have:
Cloud Scheduler set to invoke Function 1
Function 1 - loop over items and create a task for each item
Task - invoke Function 2 with data provided by Function 1
Function 2 - call external API to get data and update our db (simplified sketch right after this list)
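For completeness, Function 2 is essentially this shape (simplified, untested sketch; the external API URL is a placeholder and I'm using Firestore here just to stand in for our db):
import requests
from google.cloud import firestore

db = firestore.Client()

def endpoint(request):
    # Name of the item to refresh, passed by Function 1 as a query parameter.
    name = request.args.get("name")

    # Placeholder external API call; the real URL and auth are different.
    resp = requests.get(f"https://api.example.com/items/{name}")
    resp.raise_for_status()
    data = resp.json()

    # Firestore is assumed here; our actual db layer differs.
    db.collection("items").document(name).set(data)
    return "OK", 200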
The issue is that there are ~2k items to update daily, and a Cloud Function times out before it can get through them all, which is why I put them in a queue. But even just adding the items to the queue takes too long, so Function 1 is timing out before it can create all the tasks.
Is there a simple way to bulk add multiple tasks to a queue at once?
Failing that, is there a better approach to all of this?
Everything is written in Python.
Code for Function 1:
from google.cloud import tasks_v2

def refresh(request):
    # 'items' is loaded elsewhere (roughly 2k dicts, each with a 'name' key).
    for i in items:
        # Create a client.
        client = tasks_v2.CloudTasksClient()

        project = 'my-project'
        queue = 'refresh-queue'
        location = 'europe-west2'
        name = i['name'].replace(' ', '')
        url = f"https://europe-west2-my-project.cloudfunctions.net/endpoint?name={name}"

        # Construct the fully qualified queue name.
        parent = client.queue_path(project, location, queue)

        # Construct the request body.
        task = {
            "http_request": {  # Specify the type of request.
                "http_method": tasks_v2.HttpMethod.GET,
                "url": url,  # The full url path that the task will be sent to.
            }
        }

        # Use the client to build and send the task.
        response = client.create_task(request={"parent": parent, "task": task})
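For illustration, the closest workaround I can think of is reusing one client and firing the create_task calls from a thread pool, but I don't know if that's the intended pattern (untested sketch; the worker count is arbitrary):
from concurrent.futures import ThreadPoolExecutor
from google.cloud import tasks_v2

def refresh(request):
    # One client shared across all tasks instead of one per item.
    client = tasks_v2.CloudTasksClient()
    parent = client.queue_path('my-project', 'europe-west2', 'refresh-queue')

    def enqueue(item):
        name = item['name'].replace(' ', '')
        task = {
            "http_request": {
                "http_method": tasks_v2.HttpMethod.GET,
                "url": f"https://europe-west2-my-project.cloudfunctions.net/endpoint?name={name}",
            }
        }
        return client.create_task(request={"parent": parent, "task": task})

    # 20 workers is a guess; 'items' is loaded the same way as above.
    with ThreadPoolExecutor(max_workers=20) as pool:
        list(pool.map(enqueue, items))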