I have an array entryIdsWithImages whose elements act like folder names in Google Cloud Storage, and I want to download all the images under those specific GCS paths. Following the @google-cloud/storage docs, I first list all the files under each path, then download them into a temporary directory, which I later zip. However, the downloaded images all have a size of 0 B, even though they are not empty in the actual storage. Where am I going wrong?
await Promise.all(
  entryIdsWithImages.map(async (entryId) => {
    const prefix = `images/${uid}/${entryId}`;
    // Download the images for the entry.
    const [files] = await bucket.getFiles({ prefix });
    files.forEach(async (file) => {
      // file.name includes the full path, so extract only the final
      // segment, which is the file's name.
      const fileName = file.name.slice(file.name.lastIndexOf('/') + 1);
      const imgTempFilePath = path.join(imageDirectory, fileName);
      try {
        await file.download({
          destination: `${imgTempFilePath}.jpg`,
        });
      } catch (e) {
        console.log(
          `Error downloading the image at ${prefix}/${fileName}: `,
          e
        );
      }
    });
  })
);
The version I'm using is "@google-cloud/storage": "^4.7.0". I also tried the latest version, with the same result.
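One detail worth noting about the pattern above (a self-contained sketch, not touching GCS at all): `Array.prototype.forEach` does not wait for `async` callbacks, so the outer `Promise.all` can resolve while the inner downloads are still pending — which matters if a later step, like zipping, reads the files immediately. A minimal demonstration of the difference between `forEach` and an awaited `files.map`:

```javascript
// Demonstrates that forEach discards the promises returned by async
// callbacks, while Promise.all over a map actually waits for them.
async function demo() {
  const results = [];

  // Simulated async "download" that completes on a later event-loop turn.
  const download = (name) =>
    new Promise((resolve) =>
      setImmediate(() => {
        results.push(name);
        resolve();
      })
    );

  const files = ['a.jpg', 'b.jpg'];

  // Pattern from the question: the callback promises are thrown away,
  // so execution continues before any download has finished.
  files.forEach(async (f) => {
    await download(f);
  });
  const afterForEach = results.length; // still 0 here

  // Awaiting the mapped promises guarantees every download completed
  // (this run also flushes the two downloads scheduled by forEach).
  await Promise.all(files.map((f) => download(f)));
  const afterMap = results.length; // all 4 simulated downloads recorded

  return { afterForEach, afterMap };
}
```

The same `files.map(...)` + `Promise.all` shape could replace the inner `files.forEach` so the zip step only runs after every `file.download` has settled.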
await storage.bucket(bucketName).file(srcFilename).download({ destination: "foo.jpg" });? - user835611