
When I upload a larger image (3+ MB) to an AWS S3 bucket, only part of the image is being saved to the bucket (roughly the top 10% of the image; the rest displays as grey space). These larger images consistently show up as 256 KB in the bucket. There isn't any issue with smaller files.

Here's my code:

s3 = boto3.resource('s3')
s3.Bucket(settings.AWS_MEDIA_BUCKET_NAME).put_object(Key=fname, Body=data)

...where data is the binary data of the image file.

There are no issues when the files are smaller, and in the S3 bucket the larger files all show as 256 KB.

I haven't been able to find any documentation about why this might be happening. Can someone please point out what I'm missing?

Thanks!

You should probably use multipart uploads, as described in the example here. - Jason
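
(For reference, a minimal multipart-style upload using boto3's transfer utilities might look like the sketch below; the bucket name, key, file path, and 1 MB threshold are placeholders.)

import boto3
from boto3.s3.transfer import TransferConfig

s3 = boto3.resource('s3')

# Force multipart uploads for anything larger than 1 MB (placeholder threshold).
config = TransferConfig(multipart_threshold=1024 * 1024, multipart_chunksize=1024 * 1024)

with open('photo.jpg', 'rb') as f:
    s3.Bucket('my-media-bucket').upload_fileobj(f, 'photo.jpg', Config=config)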
Looked into this but there's nothing documented that makes me think that 3 MB of data should require a multipart upload. - Murcielago
How are you uploading to S3? Is it using the app server as a pass-through, or are you saving the file as a temp file before sending to S3? If the latter, can you confirm that you can upload the file entirely to your app server's storage? - Jason
Thanks for your help, Jason. I've been testing image uploads against my Django dev server on my local machine and I'm running into the same issue, which leads me to believe it's not a boto/S3 issue. The uploaded data is a base64-encoded image file. I've noticed a huge difference between len(b64_encoded_data) and len(b64_decoded_data): the decoded data is much smaller (more than 75% smaller, far beyond the ~25% reduction that base64 decoding should account for). Not sure what could be causing this or whether this is the underlying issue. Any thoughts are appreciated. - Murcielago
I'd suggest you log the base64 encoding/decoding in the browser console and test the encoding/decoding server-side using Python's base64 library. Print out all four versions of the data and compare them. If the server is receiving less data than the browser is pushing out, the issue is likely in the client-server transmission. - Jason
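
(A quick server-side check along those lines, assuming the Django view receives the raw base64 string; the helper name and the data-URI handling are illustrative.)

import base64

def check_payload(raw_upload):
    # Strip a data-URI prefix ("data:image/jpeg;base64,...") if the browser sent one.
    b64_payload = raw_upload.split(',', 1)[1] if raw_upload.startswith('data:') else raw_upload
    decoded = base64.b64decode(b64_payload)
    # Base64 adds ~33% overhead, so the decoded size should be roughly 75% of the encoded size.
    print(len(b64_payload), len(decoded))
    return decoded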

1 Answer


I had the same issue, and it took me hours to figure out. I finally fixed it by creating a read stream for the file instead of passing the buffer directly. This is my code:

// aws-sdk v2
const fs = require('fs');
const AWS = require('aws-sdk');
const s3 = new AWS.S3();

const uploadFile = (filePath) => {
  let fileName = filePath;
  fs.readFile(fileName, (err, data) => {
    if (err) throw err;

    // Stream the file from disk rather than passing the buffer directly.
    let body = fs.createReadStream(filePath);

    const params = {
      Bucket: 'bucketname', // pass your bucket name
      Key: fileName,
      Body: body,
      ContentType: 'image/jpeg',
      ContentEncoding: 'base64',
    };

    s3.upload(params, function (s3Err, data) {
      if (s3Err) throw s3Err;
      console.log(`File uploaded successfully at ${data.Location}`);
    });
  });
};
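
Since the question uses boto3, a rough Python equivalent of the same streaming idea might look like this (the bucket name, key, and file path are placeholders):

import boto3

s3 = boto3.resource('s3')

def upload_file(file_path, key):
    # Pass an open file handle so boto3 streams from disk instead of an in-memory buffer.
    with open(file_path, 'rb') as body:
        s3.Bucket('my-media-bucket').put_object(Key=key, Body=body, ContentType='image/jpeg')

upload_file('./photo.jpg', 'photo.jpg')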