I'm implementing a serverless project on Google Cloud. Users will upload zip files of around 4 GB to a Cloud Storage bucket (users compress the files themselves before uploading). The archives need to be uncompressed before their contents can be processed.
I found a common solution that works for small files:
- download the zip file from the storage bucket to a cloud function
- unzip in the function
- upload the unzipped files to the storage bucket
Here, the file downloaded by the function is held in the memory allocated to the function. However, the maximum memory for a Cloud Function is 2 GB, which is too small for my files. (A minimal sketch of this small-file approach is below.)
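For reference, here is roughly what that small-file approach looks like. This is just a sketch: the destination bucket name and the `unzip_handler` entry point are placeholders of my own, and it assumes the `google-cloud-storage` client library with a storage `finalize` trigger.

```python
import io
import zipfile

from google.cloud import storage

storage_client = storage.Client()


def unzip_handler(event, context):
    """Background Cloud Function triggered when an object is finalized.

    Downloads the uploaded zip into memory, extracts each member, and
    uploads the extracted files to a destination bucket. This only works
    while the whole archive fits in the function's memory.
    """
    source_bucket = storage_client.bucket(event["bucket"])
    dest_bucket = storage_client.bucket("my-unzipped-bucket")  # placeholder name

    # The entire archive is read into the function's memory here,
    # which is exactly what breaks down for multi-GB files.
    zip_bytes = source_bucket.blob(event["name"]).download_as_bytes()

    with zipfile.ZipFile(io.BytesIO(zip_bytes)) as archive:
        for member in archive.namelist():
            if member.endswith("/"):  # skip directory entries
                continue
            dest_bucket.blob(member).upload_from_string(archive.read(member))
```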
In the worst case I could fall back to VMs, but that would be expensive.
Are there any other ways around this? My preferred language is Python.