My use case is to copy a file from Google Cloud Storage to S3, so I'm trying this out with a Google Cloud Function that is triggered by an event in the storage bucket.
Here is a snippet of the code:
var AWS = require('aws-sdk');
var s3Stream = require('s3-upload-stream')(new AWS.S3());
var gcs = require('@google-cloud/storage')();

exports.hellogcs = function (event, callback) {
    var bucket = gcs.bucket(event.data.bucket);
    // Read the object that triggered the event
    var remoteReadStream = bucket.file(event.data.name).createReadStream();
    var uploadStream = s3Stream.upload({
        "Bucket": 'my-bucket',
        "Key": 'parition1/' + event.data.name
    });
    console.log('writing into S3 stream');
    remoteReadStream.pipe(uploadStream);
};
In the log I can see "writing into S3 stream", but no file appears in the S3 bucket.
I also listed the files of my S3 bucket from within the Cloud Function, and the listing works perfectly, so the AWS credentials and connectivity seem fine.
I just want to find out why this pipe is not working, and whether there are other approaches to handle this through Cloud Functions. PS: I'm quite new to Node.js, so please correct me if there are any issues.
Edit: What I noticed is that only small files (< 3 MB) are getting copied; bigger files are not.