
Hi, I am using gsutil installed via the Google Cloud SDK. As a standalone command gsutil works properly: I am able to create buckets and upload files to them. But when I try to use it via a Windows batch file (.bat), it fails with the errors below:

Caught non-retryable exception while listing gs://sushanth-07081985/: ServiceException: 401 Anonymous caller does not have storage.objects.list access to sushanth-07081985.
CommandException: Caught non-retryable exception - aborting rsync

Below is my batch file:

set gsutil="C:\Program Files (x86)\Google\Cloud SDK\google-cloud-sdk\platform\bundledpython\python.exe" "C:\Program Files (x86)\Google\Cloud SDK\google-cloud-sdk\platform\gsutil\gsutil"

echo Backing up Test Folder
%gsutil% -m rsync -n -r -d -x ".*node_modules.*^|.*\.git.*" ./TestFolder gs://sushanth-07081985

I tried the things below after reading various S.O. posts:

  1. Executed the command gcloud config list --all and saw pass_credentials_to_gsutil was unset, so I set it to false:

    gcloud config set pass_credentials_to_gsutil false

  2. Created a service account and tried:

    gcloud auth activate-service-account --key-file codes-20180829-2c7c4b3e24df.json

  3. Ran gsutil config -e (after backing up the .boto file, since the command said it was going to change it). Here I gave the full path of the service account JSON file.
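To make the goal concrete, the flow I am aiming for in the batch job is roughly the following (the key-file name is from my setup, and the exact placement of these commands is just my assumption):

    REM Authenticate once as the service account
    gcloud auth activate-service-account --key-file codes-20180829-2c7c4b3e24df.json

    REM Then run the sync (dry run with -n) through gsutil
    gsutil -m rsync -n -r -d -x ".*node_modules.*^|.*\.git.*" ./TestFolder gs://sushanth-07081985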

After this I was not able to run gsutil at all, so I undid everything I had done and it worked as before:

  1. Replaced the .boto file with the one I had backed up earlier.
  2. Ran gcloud init (reinitialized the configuration).
  3. Ran gcloud config set pass_credentials_to_gsutil true.

I am using Windows 10. Basically, I would like to set up gsutil to use a service account to upload files to Cloud Storage via a batch job (.bat). It would be good to know a step-by-step approach.

Thanks

Update (25/12/2018): Is the approach below right?

  1. Created a service account in IAM using the defaults and generated a .p12 key file.
  2. Ran the gsutil config -e command; it asked for the full path of the .p12 file [D:\BigData\16.GCP\GCS\Private\codes-20180829-d05f0ecb939d.p12] and the SA email ID (got it from IAM itself).
  3. It generated a .boto file with values populated for gs_service_key_file [.p12 file path], gs_service_client_id [SA email ID], and gs_service_key_file_password [notasecret].
  4. Went to each bucket and manually added the SA with the role Storage Admin.
  5. Set the environment variable BOTO_PATH to C:\Users\Sushanth\.boto.
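For anyone following along, the relevant section of the generated .boto file looks roughly like this (the key-file path is from my setup; the service-account email is a made-up example, yours will come from IAM):

    [Credentials]
    gs_service_key_file = D:\BigData\16.GCP\GCS\Private\codes-20180829-d05f0ecb939d.p12
    gs_service_client_id = my-service-account@my-project.iam.gserviceaccount.com
    gs_service_key_file_password = notasecret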

Now it works. I would like to know: is the above approach correct?


1 Answer


You should be invoking the gcloud wrapper script for gsutil instead:

C:\Program Files (x86)\Google\Cloud SDK\google-cloud-sdk\bin\gsutil

This script will pass your configured gcloud auth options through to the main entry point for gsutil (the one you were invoking before, located at ...\google-cloud-sdk\platform\gsutil\gsutil).
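For example, your batch file could simply point the variable at the wrapper instead, keeping the rest of the script unchanged (quote the path because of the spaces; on Windows installs the wrapper is typically the gsutil.cmd script in that bin directory):

    set gsutil="C:\Program Files (x86)\Google\Cloud SDK\google-cloud-sdk\bin\gsutil"

    echo Backing up Test Folder
    %gsutil% -m rsync -n -r -d -x ".*node_modules.*^|.*\.git.*" ./TestFolder gs://sushanth-07081985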

If you ever get these two mixed up, you can tell which one is the Cloud SDK entry point by running gsutil's version -l command for both paths. This command will print out information about your environment, including whether or not you invoked gsutil through the Cloud SDK and which .boto configs are being loaded:

> C:\path\to\google-cloud-sdk\platform\gsutil\gsutil version -l
[...]
using cloud sdk: False
pass cloud sdk credentials to gsutil: False
config path(s): C:\Users\bob\.boto
[...]