7 votes

I am writing a Python function which uses service account credentials to call the Google Cloud SQL Admin API to export a database to a bucket.

The service account has been given project owner permissions, and the bucket has its permissions set for project owners. The SQL Admin API has been enabled for our project.

Python code:

from google.oauth2 import service_account
from googleapiclient.discovery import build
import googleapiclient
import json

def main():
    SCOPES = [
        'https://www.googleapis.com/auth/sqlservice.admin',
        'https://www.googleapis.com/auth/cloud-platform',
        'https://www.googleapis.com/auth/devstorage.full_control',
    ]
    SERVICE_ACCOUNT_FILE = './creds/service-account-credentials.json'
    PROJECT = "[REDACTED]"
    DB_INSTANCE = "[REDACTED]"
    BUCKET_PATH = "gs://[REDACTED]/[REDACTED].sql"
    DATABASES = [REDACTED]
    BODY = {  # Database instance export request.
        "exportContext": {  # Contains details about the export operation.
            "kind": "sql#exportContext",  # This is always sql#exportContext.
            "fileType": "SQL",  # The file type for the specified uri.
                # SQL: The file contains SQL statements.
                # CSV: The file contains CSV data.
            # The path to the file in Google Cloud Storage where the export will be
            # stored, in the form gs://bucketName/fileName. If the file already exists,
            # the request succeeds, but the operation fails. If fileType is SQL and the
            # filename ends with .gz, the contents are compressed.
            "uri": BUCKET_PATH,
            "databases": DATABASES,
        },
    }

    credentials = service_account.Credentials.from_service_account_file(SERVICE_ACCOUNT_FILE, scopes=SCOPES)
    sqladmin = googleapiclient.discovery.build('sqladmin', 'v1beta4', credentials=credentials)
    response = sqladmin.instances().export(project=PROJECT, instance=DB_INSTANCE, body=BODY).execute()
    print(json.dumps(response, sort_keys=True, indent=4))

Running this code nets the following error:

Traceback (most recent call last):
  File "<string>", line 1, in <module>
  File "[REDACTED]/main.py", line 47, in hello_pubsub
    response = sqladmin.instances().export(project=PROJECT, instance=DB_INSTANCE, body=BODY).execute()
  File "/usr/local/lib/python3.7/site-packages/googleapiclient/_helpers.py", line 130, in positional_wrapper
    return wrapped(*args, **kwargs)
  File "/usr/local/lib/python3.7/site-packages/googleapiclient/http.py", line 851, in execute
    raise HttpError(resp, content, uri=self.uri)
googleapiclient.errors.HttpError: <HttpError 403 when requesting https://www.googleapis.com/sql/v1beta4/projects/[REDACTED]/instances/[REDACTED]/export?alt=json returned "The service account does not have the required permissions for the bucket.">

I have tried this across 2 GCP projects, with multiple service accounts with varying permissions.

Related question: Access denied for service account (permission issue?) when importing a CSV from Cloud Storage to Cloud SQL. That issue was caused by incorrect permissions, which shouldn't be the case here, as the account has project owner permissions.

I finally figured this one out. Google doesn't actually mention this in their documentation, but each SQL instance has a corresponding service account. It's that service account which performs the export, so you must give it access to the target bucket. – Lucy Nunley

1 Answer

25 votes

Google Cloud uses Identity and Access Management (IAM) to manage access to its resources.

Each Cloud SQL instance has a corresponding service account with its own permissions. To find your Cloud SQL instance's service account name, go to:

Console > SQL > Instance Name > Service account
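If you prefer to look this up programmatically rather than in the console, the instance resource returned by instances().get() exposes the same address in its serviceAccountEmailAddress field. A minimal sketch, reusing the credentials file from the question (the project and instance names are placeholders):

from google.oauth2 import service_account
from googleapiclient.discovery import build

SERVICE_ACCOUNT_FILE = './creds/service-account-credentials.json'
PROJECT = 'your-project-id'         # placeholder
DB_INSTANCE = 'your-instance-name'  # placeholder

credentials = service_account.Credentials.from_service_account_file(
    SERVICE_ACCOUNT_FILE,
    scopes=['https://www.googleapis.com/auth/sqlservice.admin'])
sqladmin = build('sqladmin', 'v1beta4', credentials=credentials)

# The DatabaseInstance resource carries the instance's own service account email.
instance = sqladmin.instances().get(project=PROJECT, instance=DB_INSTANCE).execute()
print(instance['serviceAccountEmailAddress'])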

With this service account name, you can grant it access to the bucket by assigning one of the storage roles: Storage Admin, Storage Object Admin, Storage Object Creator, or Storage Object Viewer.

Following the principle of least privilege, you only need to grant Storage Object Creator to allow the export to write to the Cloud Storage bucket.
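Granting that role on the bucket can be done from the console, with gsutil, or in code. Below is a minimal sketch using the google-cloud-storage client library; the bucket name and the Cloud SQL service account email are placeholders you would replace with your own values, and the account running the script must itself be allowed to change the bucket's IAM policy:

from google.cloud import storage

BUCKET_NAME = 'your-export-bucket'  # placeholder: the bucket from BUCKET_PATH in the question
SQL_SA_EMAIL = 'your-cloud-sql-sa@your-project.iam.gserviceaccount.com'  # placeholder: value from the console page above

client = storage.Client()
bucket = client.bucket(BUCKET_NAME)

# Fetch the bucket's IAM policy, append the objectCreator binding, and save it back.
policy = bucket.get_iam_policy(requested_policy_version=3)
policy.bindings.append({
    'role': 'roles/storage.objectCreator',
    'members': {'serviceAccount:' + SQL_SA_EMAIL},
})
bucket.set_iam_policy(policy)

Once the binding is in place, the export request from the question should no longer return the 403.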

Detailed descriptions of the storage access roles are included in the documentation.