I have a Cloud Function triggered by a storage bucket on google.storage.object.finalize. In this function, I download the file that triggered the execution and perform some transforms on the data. If the file is ~5KB, the function executes as expected. If the file is ~200MB, the file.download callback never runs and Stackdriver shows nothing but "Function execution took 14 ms, finished with status: 'ok'".
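For reference, the finalize event hands the handler the object's metadata as its first argument, so the bucket and file name are available without any extra API calls (a minimal sketch; the field names follow the Cloud Storage object event format):

module.exports = (data, context) => {
  // data carries the metadata of the object that triggered the event.
  console.log(`Bucket: ${data.bucket}, File: ${data.name}, Size: ${data.size} bytes`);
};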
I understand that there is a 10MB limit on files uploaded via HTTP to HTTP-triggered functions, but this function is triggered by Cloud Storage. The answer given to this question states that the 10MB limit is not imposed on storage-triggered functions, so I suspect a possible timeout issue.
The function is set to a 2GB memory limit and a 5 minute timeout, and all buckets and functions are in the same region. Surely these are enough resources to transfer and process a 200MB file (profiling locally shows the process completing in a few seconds while staying under 512MB of memory).
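The local profiling mentioned above was done roughly along these lines (a sketch, not the exact harness; transformData is a trivial stand-in for the actual transforms and the file path is a placeholder):

const fs = require('fs');

const transformData = (buf) => buf.length; // stand-in for the actual transforms

const start = Date.now();
const buffer = fs.readFileSync('<local copy of the 200MB file>');
transformData(buffer);
const heapMb = process.memoryUsage().heapUsed / 1024 / 1024;
console.log(`Finished in ${Date.now() - start} ms, heap in use: ${heapMb.toFixed(1)} MB`);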
const { Storage } = require('@google-cloud/storage');
const storage = new Storage();
const uploadBucket = storage.bucket("<bucket name>");

module.exports = (data, context) => {
  console.log('Getting file ref'); // Logged regardless of file size
  const file = uploadBucket.file("<file name>");
  file.download(function (err, buffer) {
    console.log('Starting file processing');
    // With a smaller file, this callback runs as expected.
    // With larger files, this code is never reached.
  });
};
Do I have an incorrect understanding of the bandwidth available between the function and the storage bucket, or does this suggest another issue?
I was so focused on the file.download() method that I forgot to consider the termination of the Cloud Function itself. If you'd like to leave an answer, I'll be glad to mark it as accepted. - user2864874
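For future readers, a minimal sketch of the fix alluded to in the comment above: a background function is terminated as soon as its handler returns (or its returned promise settles), so the download promise must be returned for the runtime to wait for the async work to finish. Bucket and file names are still placeholders:

const { Storage } = require('@google-cloud/storage');
const storage = new Storage();
const uploadBucket = storage.bucket("<bucket name>");

module.exports = (data, context) => {
  const file = uploadBucket.file("<file name>");
  // Returning the promise keeps the function alive until the
  // download (and any chained processing) has completed.
  return file.download().then(([buffer]) => {
    console.log('Starting file processing');
    // ... transforms on buffer ...
  });
};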