I have an Ubuntu instance on Google Cloud that is successfully authenticated and connected to both a Google Cloud Storage bucket and an Amazon S3 bucket.

I can copy, move, and edit small files and folders in either bucket from the Ubuntu instance, and I can upload directly through the AWS web interface.

I have about 4 files in the Google bucket which I am trying to copy from Google to AWS with either gsutil rsync or gsutil cp.

Small files transfer without a problem, but anything over a few hundred megabytes fails with a ServiceException 400.

Here are the commands I have tried:

gsutil -o GSUtil:parallel_composite_upload_threshold=150M cp -r gs://(source bucket)/ s3://(destination bucket)/

or

gsutil rsync -d -r gs://(source bucket) s3://(destination bucket)
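For reference, gsutil can write to s3:// URLs only when it finds AWS credentials, typically in the ~/.boto configuration file. Since small files already transfer, credentials are presumably in place, but a minimal configuration looks like the sketch below (the key values are placeholders). Note also that, as far as I know, parallel composite uploads are a Cloud Storage feature, so the parallel_composite_upload_threshold option should have no effect on an s3:// destination.

```
# ~/.boto -- gsutil reads AWS credentials from here when the
# destination is an s3:// URL. Keys shown are placeholders.
[Credentials]
aws_access_key_id = YOUR_AWS_ACCESS_KEY_ID
aws_secret_access_key = YOUR_AWS_SECRET_ACCESS_KEY
```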

Can anyone advise if they have had this issue before, and the best way to push from a Google bucket to AWS without downloading and re-uploading, i.e. direct bucket to bucket?

1 Answer


From rsync - Synchronize content of two buckets/directories  |  Cloud Storage  |  Google Cloud:

If you are synchronizing a large amount of data between clouds you might consider setting up a Google Compute Engine account and running gsutil there. Since cross-provider gsutil data transfers flow through the machine where gsutil is running, doing this can make your transfer run significantly faster than running gsutil on your local workstation.

Thus, using these commands actually does download and then re-upload the data.

Since you say that you only have 4 files, it would probably be just as easy to download each of them from Google Cloud Storage and then upload them to Amazon S3 with the AWS Command-Line Interface (CLI).
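That two-step approach can be sketched as below. The bucket names, staging path, and the transfer/run helpers are illustrative, not values from the question, and the --dry-run flag is a local convenience of this script, not a gsutil or aws option.

```shell
#!/bin/sh
# Sketch of the download-then-upload approach. Bucket names and the
# staging path below are placeholders.

transfer() {
  src="$1"; dst="$2"; staging="$3"; mode="${4:-}"
  # Print each step; execute it only when not in --dry-run mode.
  run() { echo "+ $*"; [ "$mode" = "--dry-run" ] || "$@"; }

  [ "$mode" = "--dry-run" ] || mkdir -p "$staging"

  # 1. Download from the Google Cloud Storage bucket to local disk.
  run gsutil -m cp -r "gs://$src/*" "$staging/"

  # 2. Re-upload to S3; the AWS CLI handles multipart uploads for
  #    large files automatically.
  run aws s3 cp "$staging/" "s3://$dst/" --recursive
}

# Dry run (prints the commands; needs no credentials):
transfer my-gcs-bucket my-s3-bucket /tmp/staging --dry-run
```

The -m flag runs the download in parallel, and the AWS CLI's automatic multipart handling is what sidesteps the large-file failures seen with gsutil writing straight to S3.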