5 votes

I created a service account and assigned these roles:

Owner
Storage Admin
Storage Object Admin
Tester

Tester is a role I created for learning purposes with these permissions:

storage.buckets.create
storage.buckets.delete
storage.buckets.get
storage.buckets.getIamPolicy
storage.buckets.list
storage.buckets.setIamPolicy
storage.buckets.update
storage.objects.create
storage.objects.delete
storage.objects.get
storage.objects.getIamPolicy
storage.objects.list
storage.objects.setIamPolicy
storage.objects.update
...

I know these roles grant far more permissions than necessary, but this is just for testing purposes.

Given that the bucket contains only a single file and the account has the corresponding permissions, the Python code below (run on my local computer) should work:

from google.cloud import storage


if __name__ == '__main__':
    storage_client = storage.Client()
    bucket = storage_client.bucket('my-bucket-name')
    blobs = bucket.list_blobs()
    for blob in blobs:
        print(blob.name)

But it doesn't:

Traceback (most recent call last):
  File "gcloud/test.py", line 8, in <module>
    for blob in blobs:
  File "/home/user/.local/lib/python3.6/site-packages/google/api_core/page_iterator.py", line 212, in _items_iter
    for page in self._page_iter(increment=False):
  File "/home/berkay/.local/lib/python3.6/site-packages/google/api_core/page_iterator.py", line 243, in _page_iter
    page = self._next_page()
  File "/home/user/.local/lib/python3.6/site-packages/google/api_core/page_iterator.py", line 369, in _next_page
    response = self._get_next_page_response()
  File "/home/user/.local/lib/python3.6/site-packages/google/api_core/page_iterator.py", line 419, in _get_next_page_response
    method=self._HTTP_METHOD, path=self.path, query_params=params
  File "/home/user/.local/lib/python3.6/site-packages/google/cloud/_http.py", line 421, in api_request
    raise exceptions.from_http_response(response)
google.api_core.exceptions.Forbidden: 403 GET LINK: USER does not have storage.objects.list access to BUCKET.

The bucket uses uniform bucket-level access control. The service account I'm using is a member of this bucket, inheriting that membership from:

Storage Admin
Storage Object Admin
Tester

Can someone explain the reason behind this behavior?

Thanks

Did you switch this bucket to "uniform bucket-level access control" or create it like this? – Stefan G.
@StefanG. Neither case works unless I assign permissions manually (gsutil acl ch -u). Why do I have to grant permission manually if that user already inherits the roles? For example, serviceuser1 has already inherited the Storage Object Admin role, yet I still have to click "Add Member" and grant that role manually to make it work. – deebug
What account is gcloud using? gcloud auth list – DUDANF
@mirana I have two credentialed accounts: one is my Gmail account, the other is the service account, which is the active one. I created the key for the service account via gcloud iam service-accounts keys create and activated it with gcloud auth activate-service-account. – deebug
@mirana Okay, I deleted the old service account, created a new one with only the Storage Object Admin role, created a key for it, set the account and activated it. For now, everything is working. Thank you :) Please submit your comment as an answer. – deebug

4 Answers

6 votes

I personally believe that in development/testing there is no need to shy away from over-granting roles. But if you are going to grant multiple roles anyway, you may as well grant a single admin role instead of several smaller ones, since they end up doing essentially the same job.

For your specific problem here, I would suggest:

  1. Delete your old service account
  2. Create a new service account and grant it the Storage Admin (roles/storage.admin) and Storage Object Admin (roles/storage.objectAdmin) roles
  3. Use this service account
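If you want to confirm from Python that the new service account is really the one being picked up (rather than some other cached credential), here is a minimal sketch that loads the new key file explicitly; the key path and bucket name are placeholders:

from google.cloud import storage

# Load the new service account's key directly instead of relying on
# whichever Application Default Credentials happen to be active.
client = storage.Client.from_service_account_json('key.json')

# List the bucket's objects to verify the new account has access.
for blob in client.list_blobs('my-bucket-name'):
    print(blob.name)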

There is a similar post on SO, and that seemed to be resolved in a similar way.

For future readers: if the problem still persists, completely uninstall the gcloud SDK and reinstall the latest version (using this link).

2 votes

That is why I asked whether you switched this bucket to uniform bucket-level access or created it that way. My theory is that you switched it and ran into this caveat from the documentation:

Caution: If you enable uniform bucket-level access, you revoke access from users who gain their access solely through object ACLs. Be sure that you read considerations when migrating an existing bucket prior to enabling uniform bucket-level access.

That's why it works when you add the role manually.

You can read more about how uniform bucket-level access permissions work here.

Here is more information relevant to what was happening:

Additionally, if you enable uniform bucket-level access as part of creating a new bucket, the bucket automatically receives additional Cloud IAM roles. This behavior maintains the permissioning that objects inherited from the bucket's default object ACLs. If you enable uniform bucket-level access on an existing bucket, you must apply any such roles manually; you may want to apply a different set of roles if you have changed the bucket's default object ACLs.

Also this, which I understand to be an explanation of the error you were getting:

Once enabled, the following ACL functionality ceases:

Requests to set, read, or modify bucket and object ACLs fail with 400 Bad Request errors.
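
Since ACL calls fail on such a bucket, the manual grant has to go through bucket-level IAM instead. As a rough sketch with the Python storage client (the bucket name, role, and service account email are placeholders):

from google.cloud import storage

client = storage.Client()
bucket = client.get_bucket('my-bucket-name')

# Confirm that uniform bucket-level access is actually enabled on the bucket.
print(bucket.iam_configuration.uniform_bucket_level_access_enabled)

# Grant the role on the bucket itself via IAM, since object ACLs no longer apply.
policy = bucket.get_iam_policy(requested_policy_version=3)
policy.bindings.append({
    "role": "roles/storage.objectAdmin",
    "members": {"serviceAccount:tester@my-project.iam.gserviceaccount.com"},
})
bucket.set_iam_policy(policy)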

Hope this helps.

0 votes

This worked for me:

gsutil defacl ch -u \
    <project-number-compute>@developer.gserviceaccount.com:OWNER \
    gs://bucket

See more: https://cloud.google.com/storage/docs/gsutil/commands/defacl
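
For reference, roughly the same default-object-ACL change can be made with the Python client. Note that default object ACLs only affect objects created after the change, and ACL requests fail once uniform bucket-level access is enabled, so this applies to fine-grained buckets only. The bucket name is a placeholder:

from google.cloud import storage

client = storage.Client()
bucket = client.get_bucket('my-bucket-name')

# Grant OWNER on the bucket's default object ACL; this affects objects
# created after the change, not objects that already exist.
entity = bucket.default_object_acl.user(
    '<project-number-compute>@developer.gserviceaccount.com')
entity.grant_owner()
bucket.default_object_acl.save()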

0 votes

I hope it's useful to note that this error will also come up if you have lost your config and initialization settings, as happened to me when I had to do a clean OS install.

In my case, I just had to run gcloud auth login and gcloud init (again), following the prompts and selecting option 1:

[1] Re-initialize this configuration [default] with new settings
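
After re-initializing, one quick sanity check of which credentials and project the client libraries will actually resolve (using google.auth, which is installed alongside google-cloud-storage):

import google.auth

# Resolves Application Default Credentials the same way storage.Client() does.
credentials, project = google.auth.default()
print(project)
print(getattr(credentials, 'service_account_email', '(user credentials)'))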