Working through this guide: https://cloud.google.com/functions/docs/tutorials/pubsub
I ran into an issue: I need to read messages from Pub/Sub in batches of 1,000, because I'll be posting them in batches to a remote API from my Cloud Function.
In short, 1,000 messages need to be read from Pub/Sub per invocation.
I've previously done something similar with Kinesis and Lambda using the batch-size parameter, but I haven't found a similar configuration for Cloud Functions.
aws lambda create-event-source-mapping --region us-west-2 --function-name kinesis-to-bigquery --event-source-arn <arn of the kinesis stream> --batch-size 1000 --starting-position TRIM_HORIZON
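For comparison, with that mapping in place the Lambda handler receives up to 1,000 records per invocation in a single event. A minimal sketch of what that looks like (the handler body is illustrative):

// Lambda handler: event.Records holds the whole batch
exports.handler = async (event) => {
  for (const record of event.Records) {
    // Kinesis record payloads arrive base64-encoded
    const payload = Buffer.from(record.kinesis.data, 'base64').toString();
    console.log(payload);
  }
};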
Function:
// Pub/Sub-triggered background function (the Cloud Functions Node.js
// runtime loads CommonJS modules, so the function is attached to exports)
exports.helloPubSub = (event, callback) => {
  const pubsubMessage = event.data;
  // Pub/Sub message payloads arrive base64-encoded
  const name = pubsubMessage.data ? Buffer.from(pubsubMessage.data, 'base64').toString() : 'World';
  console.log(`Hello, ${name}!`);
  callback();
};
My question is whether this is possible with Cloud Functions, or whether there are other approaches to this problem.
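One alternative I've considered is to drop the Pub/Sub trigger and instead invoke the function on a schedule or via HTTP, pulling up to 1,000 messages synchronously with the Pub/Sub client library. A minimal sketch, assuming the @google-cloud/pubsub package and a project/subscription named my-project/my-subscription (names are placeholders for my setup):

const {v1} = require('@google-cloud/pubsub');
const subClient = new v1.SubscriberClient();

// HTTP-triggered function that pulls one batch per invocation
exports.pullBatch = async (req, res) => {
  const subscription = subClient.subscriptionPath('my-project', 'my-subscription');

  // Synchronous pull: request up to 1,000 messages in one call
  const [response] = await subClient.pull({subscription, maxMessages: 1000});
  const messages = response.receivedMessages || [];

  // Post the whole batch to the remote API here, then acknowledge
  if (messages.length > 0) {
    const ackIds = messages.map((m) => m.ackId);
    await subClient.acknowledge({subscription, ackIds});
  }
  res.status(200).send(`Processed ${messages.length} messages`);
};

One caveat with this sketch: Pub/Sub may return fewer than maxMessages even when more messages are queued, so a single pull isn't guaranteed to fill the batch. I'm not sure whether this is the intended pattern, hence the question above.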