
I am trying to copy files from my local system to an AWS S3 bucket using the following copy command:

aws s3 cp folder/ s3://xxx/yyy/folder --recursive

Only the smaller files get copied; the larger files (e.g. 5 MB) fail while copying. I am receiving errors like "connection reset by peer" and "write operation timed out".

I also looked at this issue:

https://github.com/aws/aws-cli/issues/634

But nothing there worked. Please help me sort this out. Thanks in advance.

My version:

aws --version
aws-cli/1.9.2 Python/2.7.3 Linux/3.5.0-27-generic botocore/1.3.2

2 Answers


You can use the following parameter:

--page-size (integer) The number of results to return in each response to a list operation. The default value is 1000 (the maximum allowed). Using a lower value may help if an operation times out.

Setting it to 100 should help resolve your issue.
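
Applied to the command from the question, that would look like:

aws s3 cp folder/ s3://xxx/yyy/folder --recursive --page-size 100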


Alternatively, you can use the Minio client (aka mc) for this. The mc mirror command can do the same job, and it is open source. The Minio client handles multipart upload natively, and in case of a network disconnection it resumes uploading from where it left off.
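
Before mirroring, you first configure a host alias pointing at your S3 endpoint. A minimal sketch (the alias name S3alias and the placeholder keys are illustrative, and the exact subcommand may differ between mc versions):

$ mc config host add S3alias https://s3.amazonaws.com ACCESS_KEY SECRET_KEY

With the alias in place, the mirror command below uploads the local folder to it: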

$ mc mirror folder S3alias/folder

Feel free to check https://docs.minio.io/docs/minio-client-quick-start-guide for detailed information.

Hope it helps.

Disclaimer: I work for Minio