3
votes

My use case is to copy a file from Google Cloud Storage to S3, so I'm trying this with a Google Cloud Function that is triggered by an event in the storage bucket.

A snippet of the code:

    var s3Stream = require('s3-upload-stream')(new AWS.S3());
    var gcs = require('@google-cloud/storage')();

    exports.hellogcs = function (event, callback) {
      var bucket = gcs.bucket(event.data.bucket);
      // The object name comes from event.data.name.
      var remoteReadStream = bucket.file(event.data.name).createReadStream();
      var uploadStream = s3Stream.upload({
        "Bucket": 'my bucket',
        "Key": 'parition1/' + event.data.name
      });
      console.log('writing into S3 stream');
      remoteReadStream.pipe(uploadStream);
    };

In the log I can see "writing into S3 stream", but no file appears in the S3 bucket.

I also listed the files of my S3 bucket from within the Google Cloud Function, and the listing works fine.

I just want to find out why this pipe is not working, and whether there are other approaches to handle this through Cloud Functions. PS: I'm quite new to Node.js, so please correct me if there are any issues.

Edit: I noticed that only small (<3 MB) files are getting copied; big files are not.

Where are the callbacks? Is this code intentionally synchronous? - Michael - sqlbot
This is not the complete code, just a snippet. There is a callback() function, as suggested in this link: cloud.google.com/functions/docs/writing/background - Devesh S
Did you solve the problem? - Cotrariello

1 Answer

0
votes

On the Spark (free) plan, Google Cloud Functions does not allow outbound network calls to hosts other than Google's. This might be the problem if you haven't upgraded to a paid plan.