
I am trying to automate backup uploads of databases to GCP buckets.

I am working on a virtual machine on Google Cloud. I want to create a SQL Agent job that runs a cmd command to upload the backup file to a bucket.

When I run the command manually in cmd it works, but when I run it through SQL Agent it doesn't. The job reports success, but nothing is actually uploaded.

The first issue was that the Google service account that runs the machine didn't have permission to write to the bucket; I solved that. I also created a boto file under my user with OAuth using gsutil.

I tried adding the -D flag to gsutil to debug; if I understand correctly, it can't find a config file. The SQL Agent job is set with me as the owner, but the SQL Agent service runs as a network service.

Any ideas why it won't upload the data?

From what I understand, gsutil is not set to the correct project ID. gcloud is set to the correct project ID, but I don't know how to set gsutil to my project ID. I looked at the boto file and can't find a project ID configuration.
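For reference, when gsutil does load a boto file, the default project normally lives in the `[GSUtil]` section. A minimal sketch of what that section could look like (the project ID below is a placeholder, not your actual value):

```ini
[GSUtil]
default_project_id = my-project-id
```

Passing `-o GSUtil:default_project_id=...` on the command line, as in the debug output below, is the equivalent one-off override.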

This is the debug output from the command:

Command being run: C:\Program Files (x86)\Google\Cloud SDK\google-cloud-sdk\platform/gsutil\gsutil -o GSUtil:default_project_id=xxxx-246709 -o GoogleCompute:service_account=default -D -m rsync -r D:\backup\ gs://xxx
config_file_list: [u'No config found']
config: [('debug', '0'), ('working_dir', '/mnt/pyami'),


1 Answer


I also created a boto file under my user with OAuth using gsutil

Where is that file? If it's at <path to your home directory>\.boto, the Boto library will load it automatically. If it's not in one of the known places that Boto checks at runtime, you'll need to set the BOTO_CONFIG environment variable before running gsutil to tell Boto where to find the config. You can find the path to your config by running gsutil -D version and looking at the config_file_list line.
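Note that the SQL Agent service runs as NETWORK SERVICE, not as your user, so a boto file under your own profile won't be picked up there. A sketch of what the job step's cmd script could look like, assuming the config path and bucket name are placeholders for your actual values:

```bat
rem Point Boto at an explicit config file so gsutil finds it
rem regardless of which account the SQL Agent service runs as.
set BOTO_CONFIG=C:\Users\myuser\.boto
gsutil -m rsync -r D:\backup\ gs://xxx
```

Alternatively, set BOTO_CONFIG as a system-wide environment variable so every account, including the service account, sees it.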