We have a large amount of data and need to update documents based on a status field. We will be writing in batches of 500 documents (the Firestore batched-write limit) and are wondering how many documents we can commit in a single trigger invocation.
Our trigger is a Pub/Sub trigger in Firebase Cloud Functions.
We see there is a limit of 540 seconds per invocation, so we would like to know the maximum number of documents we can write in batches within that window.
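For context, this is roughly the pattern the function uses; the topic, collection, and field names below are illustrative, not our real ones:

```ts
import * as functions from "firebase-functions";
import * as admin from "firebase-admin";

admin.initializeApp();
const db = admin.firestore();

export const bulkStatusUpdate = functions
  .runWith({ timeoutSeconds: 540 }) // the 540s ceiling mentioned above
  .pubsub.topic("bulk-status-update") // illustrative topic name
  .onPublish(async () => {
    let total = 0;
    // Pull pages of matching documents and commit them 500 at a time
    // (500 is Firestore's per-batch write limit). Each update removes
    // the document from the query's result set, so re-querying acts
    // as pagination.
    for (;;) {
      const page = await db
        .collection("items") // illustrative collection
        .where("status", "==", "pending")
        .limit(500)
        .get();
      if (page.empty) break;

      const batch = db.batch();
      page.docs.forEach((doc) => batch.update(doc.ref, { status: "processed" }));
      await batch.commit();
      total += page.size;
    }
    functions.logger.info(`updated ${total} documents`);
  });
```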
Update: adding the use case
I have an events collection (Events) where users can subscribe to each event happening in a country.
Users have an API to see how many events they have subscribed to, with query flags for whether an event is Live/Finished/Upcoming.
As I can't store the list of subscribed users as an array in the event document itself (assuming the subscriber list can grow beyond the document size limit), I maintain a separate subcollection under each user's document, e.g. users/user-id/subscribedevents.
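For illustration, each subscription entry looks roughly like this; the field names are my own assumptions, not a fixed schema:

```ts
import * as admin from "firebase-admin";

admin.initializeApp();
const db = admin.firestore();

async function subscribe(userId: string, eventId: string): Promise<void> {
  await db
    .collection("users").doc(userId)
    .collection("subscribedevents").doc(eventId)
    .set({
      eventId, // duplicated here so a collection-group query can find it
      status: "Upcoming", // denormalised copy of the event's status
      subscribedAt: admin.firestore.FieldValue.serverTimestamp(),
    });
}
```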
I update each event document's status (Live/Finished/Upcoming) from a cron job that runs every minute, because Firestore won't let me apply range filters on two different fields (startDate & endDate) in a single query.
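The cron job is a scheduled function along these lines (a sketch only; it assumes fewer than 500 status transitions per minute, beyond which the batch would need chunking):

```ts
import * as functions from "firebase-functions";
import * as admin from "firebase-admin";

admin.initializeApp();
const db = admin.firestore();

export const refreshEventStatuses = functions.pubsub
  .schedule("every 1 minutes")
  .onRun(async () => {
    const now = admin.firestore.Timestamp.now();

    // One equality filter plus one range filter is allowed per query,
    // but range filters on both startDate and endDate together are
    // not, hence two separate queries.
    const starting = await db
      .collection("events")
      .where("status", "==", "Upcoming")
      .where("startDate", "<=", now)
      .get();
    const ending = await db
      .collection("events")
      .where("status", "==", "Live")
      .where("endDate", "<=", now)
      .get();

    const batch = db.batch();
    starting.docs.forEach((d) => batch.update(d.ref, { status: "Live" }));
    ending.docs.forEach((d) => batch.update(d.ref, { status: "Finished" }));
    await batch.commit();
    return null;
  });
```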
Whenever an event's status changes, I need to update the corresponding entries in the subscribedevents subcollections (under each user's document).
As that means touching every matching subscribedevents entry, I want to do it in batches, roughly as sketched below.
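Here is the fan-out I have in mind, assuming each subscribedevents entry stores the eventId so a collection-group query can find it (this also needs a collection-group index on eventId):

```ts
import * as admin from "firebase-admin";

admin.initializeApp();
const db = admin.firestore();

// Fan a status change out to every subscriber's copy of the event.
async function fanOutStatus(eventId: string, status: string): Promise<void> {
  const snap = await db
    .collectionGroup("subscribedevents")
    .where("eventId", "==", eventId)
    .get();

  // Commit in chunks of 500, the per-batch write limit.
  for (let i = 0; i < snap.docs.length; i += 500) {
    const batch = db.batch();
    snap.docs
      .slice(i, i + 500)
      .forEach((d) => batch.update(d.ref, { status }));
    await batch.commit();
  }
}
```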
Hope the use case gives some clarity on where this is applied. As Firestore is designed for scale, I'm wondering how others are handling this scenario, since it seems very common.