3 votes

blob.upload_from_filename(source) gives the error

raise exceptions.from_http_status(response.status_code, message, response=response)
google.api_core.exceptions.Forbidden: 403 POST https://www.googleapis.com/upload/storage/v1/b/bucket1-newsdata-bluetechsoft/o?uploadType=multipart: ('Request failed with status code', 403, 'Expected one of', )

I am following the Google Cloud example written in Python here.

    from google.cloud import storage

    def upload_blob(bucket, source, des):
        client = storage.Client.from_service_account_json('/path')
        storage_client = storage.Client()
        bucket = storage_client.get_bucket(bucket)
        blob = bucket.blob(des)
        blob.upload_from_filename(source)

I used gsutil to upload files, which is working fine.
Tried to list the bucket names using the python script which is also working fine.
I have necessary permissions and GOOGLE_APPLICATION_CREDENTIALS set.


4 Answers

7 votes

None of this was working because the service account I was using in GCP did not have the Storage Admin permission.

Granting Storage Admin to my service account solved my problem.
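For reference, the role can also be granted from the command line; this is a sketch, where the project ID and service-account email below are placeholders, not values from the question:

```shell
# Grant the Storage Admin role to a service account.
# PROJECT_ID and the service-account email are placeholders.
gcloud projects add-iam-policy-binding PROJECT_ID \
    --member="serviceAccount:my-sa@PROJECT_ID.iam.gserviceaccount.com" \
    --role="roles/storage.admin"
```

If full admin rights are broader than needed, a narrower role such as `roles/storage.objectAdmin` may be enough for uploads.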

1 vote

As other answers have indicated, this is a permissions issue. I have found the following command to be a useful way to create default application credentials for the currently logged-in user.

Assuming you got this error while running the code on some machine, the following steps should be sufficient:

  • SSH into the VM where the code is running (or will be running). Make sure you are a user who has permission to upload to Google Cloud Storage.
  • Run the following command: gcloud auth application-default login
  • The command asks you to create a token by opening a URL. Generate the token and paste it into the SSH console.

That's it. Every Python application started as that user will use these default credentials when interacting with storage buckets.
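As a rough illustration of where those credentials end up, the sketch below lists the locations the client libraries check for Application Default Credentials on Linux/macOS. The helper name is made up, and the lookup order is a simplification of the documented ADC search:

```python
import os
import pathlib

def adc_candidate_paths():
    """Candidate locations for Application Default Credentials,
    in the order the Google client libraries check them (simplified)."""
    paths = []
    # 1. An explicit key file pointed to by GOOGLE_APPLICATION_CREDENTIALS.
    env = os.environ.get("GOOGLE_APPLICATION_CREDENTIALS")
    if env:
        paths.append(pathlib.Path(env))
    # 2. The well-known file written by `gcloud auth application-default login`
    #    (Linux/macOS location; Windows uses %APPDATA%\gcloud instead).
    paths.append(pathlib.Path.home() / ".config" / "gcloud"
                 / "application_default_credentials.json")
    return paths
```

This also explains why setting GOOGLE_APPLICATION_CREDENTIALS to a service-account key overrides the login-based credentials: it is checked first.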

Happy GCP'ing :)

0 votes

This question is more appropriate for a support case.

As you are getting a 403, you are most likely missing a permission in IAM; the Google Cloud Platform support team would be able to inspect your resources and configuration.

0 votes

This is what worked for me when the Google documentation didn't. I was getting the same error even with the appropriate permissions.

import pathlib
import google.cloud.storage as gcs

client = gcs.Client()

# set target file to write to
target = pathlib.Path("local_file.txt")

# set file to download
FULL_FILE_PATH = "gs://bucket_name/folder_name/file_name.txt"

# open filestream with write permissions
with target.open(mode="wb") as downloaded_file:
    # download and write file locally
    client.download_blob_to_file(FULL_FILE_PATH, downloaded_file)
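As a side note, the gs:// string above is just a bucket name plus an object path. A small hypothetical helper (not part of the client library) makes the split explicit, which is handy when switching between the URI-based and the bucket/blob-based APIs:

```python
def split_gs_uri(uri):
    """Split a gs://bucket/object URI into (bucket_name, blob_name)."""
    if not uri.startswith("gs://"):
        raise ValueError("not a gs:// URI: %r" % uri)
    bucket, _, blob = uri[len("gs://"):].partition("/")
    return bucket, blob

# split_gs_uri("gs://bucket_name/folder_name/file_name.txt")
# -> ("bucket_name", "folder_name/file_name.txt")
```

With the pieces separated, the same download could be written as client.bucket(bucket).blob(blob).download_to_file(...), and an upload as in the question's code.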