1 vote

I need to back up our Google Cloud Storage buckets; versioning alone is not enough. I was thinking about:

  • backup to S3 - is there an automated bucket sync from GS to S3, or an out-of-the-box solution for scheduled transfers between buckets?
  • backup to another GS bucket - in the same GCP project, a Coldline "replica" bucket with read-only privileges for most users and some automated process to replicate/sync the data?
  • any other ideas?

thanks :)

Google Cloud does not have a built-in service for this purpose, so the best you can do is run a gsutil rsync cron job from some machine, as @Mike suggested. I would also recommend keeping the backup in a GS bucket if you have a large amount of data, since GS-bucket-to-GS-bucket transfers are very fast. - Yogesh Patil
thanks @YogeshPatil. So I guess the best option would be to sync it with cron to a bucket in a different geolocation? Any idea whether there is a fee for traffic between two different coldline/nearline geolocations? - bartimar
Wait, I think the Google Cloud Transfer Service is the answer to your problem: cloud.google.com/storage/transfer. Let me know if it solves your purpose, and I will update the answer. :) - Yogesh Patil

2 Answers

3 votes

As mentioned in a comment, the GCS Transfer Service is what you are looking for, at least for the "backup to another GS bucket" part.

From the doc:

Transfer data to your Cloud Storage buckets from Amazon Simple Storage Service (S3), HTTP/HTTPS servers, or other buckets. You can schedule one-time or daily transfers, and you can filter files based on name prefix and when they were changed.
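A scheduled bucket-to-bucket transfer can be created from the Cloud Console, or by POSTing a transferJobs resource to the Storage Transfer API. A minimal sketch of such a job, assuming hypothetical bucket names (my-bucket, my-bucket-replica) and project ID; check the exact field set against the current API reference:

```json
{
  "description": "Daily backup of my-bucket to my-bucket-replica",
  "projectId": "my-project",
  "status": "ENABLED",
  "schedule": {
    "scheduleStartDate": { "year": 2017, "month": 1, "day": 1 },
    "startTimeOfDay": { "hours": 2, "minutes": 0 }
  },
  "transferSpec": {
    "gcsDataSource": { "bucketName": "my-bucket" },
    "gcsDataSink": { "bucketName": "my-bucket-replica" }
  }
}
```

Leaving out a scheduleEndDate makes the job repeat daily from the start date onward.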

2 votes

You could use gsutil rsync to do this:

gsutil -m rsync -rd gs://your-bucket s3://your-bucket

(similarly for syncing between GCS buckets).

You would need to set up a cron job or something similar to run this periodically.
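As one possible setup (assuming gsutil is installed on the machine and AWS credentials for the S3 bucket are configured in its .boto file; the log path is arbitrary), a crontab entry that runs the sync every night at 03:00 could look like:

```shell
# m h dom mon dow  command
0 3 * * * gsutil -m rsync -rd gs://your-bucket s3://your-bucket >> /var/log/bucket-backup.log 2>&1
```

Note that the -d flag deletes objects in the destination that no longer exist in the source, so the replica mirrors the source exactly rather than accumulating stale objects; drop it if you want deletions in the primary bucket to leave the backup untouched.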