59
votes

Update: For future reference, Amazon has now updated the documentation from what was there at the time of asking. As per @Loren Segal's comment below:

We've corrected the docs in the latest preview release to document this parameter properly. Sorry about the mixup!


I'm trying out the developer preview of the AWS SDK for Node.Js and want to upload a zipped tarball to S3 using putObject.

According to the documentation, the Body parameter should be...

Body - (Base64 Encoded Data)

...therefore, I'm trying out the following code...

var AWS = require('aws-sdk'),
    fs = require('fs');

// For dev purposes only
AWS.config.update({ accessKeyId: 'key', secretAccessKey: 'secret' });

// Read in the file, convert it to base64, store to S3
fs.readFile('myarchive.tgz', function (err, data) {
  if (err) { throw err; }

  var base64data = new Buffer(data, 'binary').toString('base64');

  var s3 = new AWS.S3();
  s3.client.putObject({
    Bucket: 'mybucketname',
    Key: 'myarchive.tgz',
    Body: base64data
  }).done(function (resp) {
    console.log('Successfully uploaded package.');
  });

});

Whilst I can then see the file in S3, if I download it and attempt to decompress it I get an error that the file is corrupted. Therefore it seems that my method for 'base64 encoded data' is off.

Can someone please help me to upload a binary file using putObject?


3 Answers

55
votes

You don't need to convert the buffer to a base64 string. Just set Body to the raw data buffer and it will work.

26
votes

Here is a way to send a file using streams, which might be necessary for large files and will generally reduce memory overhead:

var AWS = require('aws-sdk'),
    fs = require('fs');

// For dev purposes only
AWS.config.update({ accessKeyId: 'key', secretAccessKey: 'secret' });

// Stream the file directly to S3 (no base64 conversion needed)
var fileStream = fs.createReadStream('myarchive.tgz');
fileStream.on('error', function (err) {
  throw err;
});
fileStream.on('open', function () {
  var s3 = new AWS.S3();
  s3.putObject({
    Bucket: 'mybucketname',
    Key: 'myarchive.tgz',
    Body: fileStream
  }, function (err) {
    if (err) { throw err; }
  });
});

10
votes

I was able to upload my binary file this way.

var AWS = require('aws-sdk'),
    fs = require('fs');

var s3 = new AWS.S3();

var fileStream = fs.createReadStream("F:/directory/fileName.ext");
var putParams = {
    Bucket: s3bucket, // bucket name, defined elsewhere
    Key: s3key,       // object key, defined elsewhere
    Body: fileStream
};
s3.putObject(putParams, function(putErr, putData){
    if(putErr){
        console.error(putErr);
    } else {
        console.log(putData);
    }
});