// Fetch the image as binary data
const imageResponse = await axios.get(url[0], {
  responseType: "arraybuffer",
});
// The response is binary, so no text encoding should be applied here
const buffer = Buffer.from(imageResponse.data);
const image = Media.addImage(doc, buffer);

I'm using the above code inside a loop that executes 100 times, because the document has 100 images. Each image is at most 150 KB. I deployed the Cloud Function with 256 MB of memory, and I'm getting "Error: memory limit exceeded. Function invocation was interrupted".
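One way to keep the peak footprint low is to process the images strictly sequentially, so that only one download buffer is alive at a time. A minimal sketch of that pattern; `fetchImage` and `addToDoc` are hypothetical stand-ins for the axios call and `Media.addImage` shown above:

```javascript
// Sketch: add images one at a time so only a single image buffer is
// resident in memory. fetchImage and addToDoc are placeholder names
// standing in for the axios download and Media.addImage calls.
async function addImagesSequentially(urls, fetchImage, addToDoc) {
  for (const url of urls) {
    const buffer = await fetchImage(url); // download one image
    addToDoc(buffer);                     // embed it in the document
    // `buffer` goes out of scope at the end of each iteration and
    // becomes eligible for garbage collection before the next download.
  }
}
```

Note that the assembled document itself must still fit in memory, so sequential processing alone may not be enough for very large documents.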

Problem statement:

I need to add 250 images to a Word document, but I'm getting a memory-limit-exceeded error.

Q&A

Is there a way to fetch one image, add it to the Word document, and then clear the memory used by that image?

How can this library be used effectively in a Firebase Cloud Function, with the images stored in Cloud Storage?

Environment:

Firebase Cloud Function (NodeJs)

Memory: 256 MB

Word Doc Generating Library : docx (https://docx.js.org/#/)

Why are you certain that clearing the memory used by the image will help? The buffers are probably already being garbage collected. It sounds like your document is just too big to fit in memory after you've added all these images to it. – Doug Stevenson
Yeah, you're right, I shouldn't clear the data. Is there any solution to this? Also, the Cloud Function's maximum request timeout is only 540 seconds. – Dinesh S
@DougStevenson Is there an effective way to achieve this use case with a Firebase Cloud Function? – Dinesh S
Increase the memory allocated to the function. – Doug Stevenson
What if multiple users use this function simultaneously? – Dinesh S

1 Answer


For the kind of scenario you are describing, as Doug mentions, you should consider increasing the resources allocated to your function so it can better handle the requests.

You can set the memory with the --memory flag of the gcloud command used to deploy your functions, for example:

gcloud beta functions deploy my_function \
  --runtime=python37 \
  --trigger-event=providers/cloud.firestore/eventTypes/document.write \
  --trigger-resource=projects/project_id/databases/(default)/documents/messages/{pushId} \
  --memory=AmountOfMemory
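Since the question uses a Firebase Cloud Function, the same setting can also be expressed in code when deploying through the Firebase CLI. A sketch using the first-generation firebase-functions `runWith` option; the function name and handler body are placeholders:

```javascript
const functions = require("firebase-functions");

// Sketch: raise the memory and timeout for this one function.
// 256MB is the default; valid values include 512MB, 1GB, and 2GB,
// and timeoutSeconds can be raised up to the 540-second maximum.
exports.generateDoc = functions
  .runWith({ memory: "1GB", timeoutSeconds: 540 })
  .https.onRequest(async (req, res) => {
    // ... build the Word document here ...
    res.status(200).send("done");
  });
```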

I recommend you take a look at the best practices for Cloud Functions document, which explains:

"Local disk storage in the temporary directory is an in-memory filesystem. Files that you write consume memory available to your function, and sometimes persist between invocations. Failing to explicitly delete these files may eventually lead to an out-of-memory error and a subsequent cold start."

For a better perspective on how Cloud Functions handles requests, check this document, which mentions:

"Cloud Functions handles incoming requests by assigning them to instances of your function. Depending on the volume of requests, as well as the number of existing function instances, Cloud Functions may assign a request to an existing instance or create a new one

Each instance of a function handles only one concurrent request at a time. This means that while your code is processing one request, there is no possibility of a second request being routed to the same instance. Thus the original request can use the full amount of resources (CPU and memory) that you requested."