5
votes

I recently learned about the new storage tiers and reduced prices announced for Google Cloud Storage.

So I wanted to change the default storage class for one of my buckets from Durable Reduced Availability to Coldline, as that is what is appropriate for the files that I'm archiving in that bucket.

I got this note though:

Changing the default storage class only affects objects you add to this bucket going forward. It does not change the storage class of objects that are already in your bucket.

Any advice/tips on how I can change class of all existing objects in the bucket (using Google Cloud Console or gsutil)?


5 Answers

13
votes

The easiest way to synchronously move the objects to a different storage class in the same bucket is to use rewrite. For example, to do this with gsutil, you can run:

gsutil -m rewrite -s coldline gs://your-bucket/**

Note: make sure gsutil is up to date (version 4.22 and above support the -s flag with rewrite).

Alternatively, you can use the new SetStorageClass action of the Lifecycle Management feature to asynchronously (usually takes about 1 day) modify storage classes of objects in place (e.g. by using a CreatedBefore condition set to some time after you change the bucket's default storage class).
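For example, a lifecycle configuration using a CreatedBefore condition might look like the following (the date here is a placeholder; set it to some time after you changed the bucket's default storage class):

```json
{
  "lifecycle": {
    "rule": [
      {
        "action": {
          "type": "SetStorageClass",
          "storageClass": "COLDLINE"
        },
        "condition": {
          "createdBefore": "2017-02-28"
        }
      }
    ]
  }
}
```

You would then apply it with gsutil lifecycle set, as shown in another answer below for the matchesStorageClass variant.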

0
votes

You can now use the "Data Transfer" feature to change the storage class by moving your bucket's objects to a new bucket.

Access it from the left panel of the Storage section.


0
votes

If you can't use gsutil, for example in a Google Cloud Functions environment (Cloud Functions server instances don't have gsutil installed; it works on your local machine because you have it installed and configured there), I suggest evaluating the update_storage_class() blob method in Python. The method is called on an individual blob, i.e. on a specific object inside your bucket. Here is an example:

from google.cloud import storage

storage_client = storage.Client()
bucket_name = "your-bucket-name"  # replace with your bucket

# Storage class values accepted by the API
all_classes = ["NEARLINE", "COLDLINE", "ARCHIVE", "STANDARD", "MULTI_REGIONAL", "REGIONAL"]
new_class = all_classes[1]  # pick the class you want, e.g. COLDLINE

for blob in storage_client.list_blobs(bucket_name):
    print(blob.name)
    print(blob.storage_class)
    blob.update_storage_class(new_class)


0
votes

To change the storage class from NEARLINE to COLDLINE, create a JSON file with the following content:

{
  "lifecycle": {
    "rule": [
      {
        "action": {
          "type": "SetStorageClass",
          "storageClass": "COLDLINE"
        },
        "condition": {
          "matchesStorageClass": [
            "NEARLINE"
          ]
        }
      }
    ]
  }
}

Name it lifecycle.json or something, then run this in your shell:

$ gsutil lifecycle set lifecycle.json gs://my-cool-bucket

The changes may take up to 24 hours to go through. As far as I know, this change will not cost anything extra.
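If you'd rather generate that policy file from code, here is a minimal sketch using only the Python standard library (the file name lifecycle.json and the class names simply match the example above):

```python
import json

# Build the same lifecycle policy as above:
# rewrite NEARLINE objects to COLDLINE in place.
policy = {
    "lifecycle": {
        "rule": [
            {
                "action": {"type": "SetStorageClass", "storageClass": "COLDLINE"},
                "condition": {"matchesStorageClass": ["NEARLINE"]},
            }
        ]
    }
}

# Write it out so it can be applied with:
#   gsutil lifecycle set lifecycle.json gs://my-cool-bucket
with open("lifecycle.json", "w") as f:
    json.dump(policy, f, indent=2)
```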

0
votes

I did this:

gsutil -m rewrite -r -s <storage-class> gs://my-bucket-name/

(-r for recursive, because I want all objects in my bucket to be affected).