
I'm looking for a Python 3 example of how to get an access token so that I can import a CSV file from GCS into Cloud SQL from a Google Cloud Function.

Since this runs from a Cloud Function, my expectation was that the service account the function runs under (or the Cloud SQL instance's service account) would be picked up automatically if granted access, but that doesn't seem to be the case.

Response HTTP Response Body: {
  "error": {
    "code": 401,
    "message": "Request is missing required authentication credential. Expected OAuth 2 access token, login cookie or other valid authentication credential. See https://developers.google.com/identity/sign-in/web/devconsole-project.",
    "errors": [
      {
        "message": "Login Required.",
        "domain": "global",
        "reason": "required",
        "location": "Authorization",
        "locationType": "header"
      }
    ],
    "status": "UNAUTHENTICATED"
  }
}

Below is my code; I'm curious whether anyone has sample code showing how to get it to authenticate.

import json
import requests

response = requests.post(
    url="https://www.googleapis.com/sql/v1beta4/projects/redacted-project/instances/redacted-instance/import",
    headers={"Content-Type": "application/json; charset=utf-8"},
    data=json.dumps({
        "importContext": {
            "fileType": "CSV",
            "csvImportOptions": {
                "table": "service_data"
            },
            "uri": "gs://redacted-bucket/log/" + blob.name,
            "database": "redacted-db"
        }
    })
)
print('Response HTTP Status Code: {status_code}'.format(status_code=response.status_code))
print('Response HTTP Response Body: {content}'.format(content=response.content))

3 Answers


You should use google-api-python-client to construct a service object for this API instead of making the request directly. This lets it pick up the default service account of the Cloud Function:

from googleapiclient.discovery import build
service = build('sqladmin', 'v1beta4')
...

More details here: https://github.com/googleapis/google-api-python-client/blob/master/docs/start.md
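Filling in the `...` above, an import call with the discovery client might look like this (a sketch: the body-building helper and its parameter names are my own invention; `sqladmin` is the discovery name of the Cloud SQL Admin API):

```python
def make_import_body(bucket, blob_name, database, table):
    # Hypothetical helper: builds the importContext payload for a CSV import.
    return {
        "importContext": {
            "fileType": "CSV",
            "csvImportOptions": {"table": table},
            "uri": "gs://{}/{}".format(bucket, blob_name),
            "database": database,
        }
    }


def import_csv(project, instance, body):
    # Imported inside the function so the module loads even where the
    # library is absent; build() picks up the Cloud Function's runtime
    # service account credentials automatically.
    from googleapiclient.discovery import build

    service = build("sqladmin", "v1beta4")
    return service.instances().import_(
        project=project, instance=instance, body=body).execute()
```

Note the trailing underscore in `import_` — `import` is a Python keyword, so the generated client appends an underscore to the method name.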


1. From your Cloud Function, get an auth token by querying the metadata server. This assumes the function runs under the default service account (the App Engine default service account), which has the Editor role.

import requests
import json

METADATA_URL = 'http://metadata.google.internal/computeMetadata/v1/'
METADATA_HEADERS = {'Metadata-Flavor': 'Google'}
SERVICE_ACCOUNT = 'default'


def import_table(request):
    url = '{}instance/service-accounts/{}/token'.format(
        METADATA_URL, SERVICE_ACCOUNT)

    # Request an access token from the metadata server.
    r = requests.get(url, headers=METADATA_HEADERS)
    r.raise_for_status()

    # Extract the access token from the response.
    access_token = r.json()["access_token"]


    body = json.dumps({'importContext': {'fileType': 'CSV',
        'csvImportOptions': {'table': 'your_table'},
        'uri': 'gs://temprun/your_dump_file',
        'database': 'your_database'}})

    response = requests.post(
            url="https://www.googleapis.com/sql/v1beta4/projects/your_project/instances/your_sql_instance/import",
            headers={"Content-Type": "application/json; charset=utf-8",
                     "Authorization": "Bearer {}".format(access_token)
            },
            data=body)    

    return str(response)



2. Using the client library google-api-python-client:

def import_table(request):

    from googleapiclient.discovery import build
    service = build('sqladmin', 'v1beta4')

    body = {'importContext': {'fileType': 'CSV',
        'csvImportOptions': {'table': 'your_table'},
        'uri': 'gs://temprun/your_dump_file',
        'database': 'your_database'}}

    service.instances().import_(project='your_project', instance='your_instance', body=body).execute()

    return "Table was imported"

If successful, the response body contains an instance of Operation.

{'kind': 'sql#operation',
 'targetLink': 'https://sqladmin.googleapis.com/sql/v1beta4/projects/your-project/instances/instance',
 'status': 'PENDING',
 'user': 'youraccount',
 'insertTime': '2020-03-18T09:02:55.437Z',
 'operationType': 'IMPORT',
 'importContext': {'uri': 'gs://yourbucket/dumpfile',
  'database': 'yourdatabase',
  'kind': 'sql#importContext',
  'fileType': 'CSV',
  'csvImportOptions': {'table': 'sql-table'}},
 'name': 'cdcd53d4-96fe-41cf-aee4-12cf6ec6394e',
 'targetId': 'instance_name',
 'selfLink': 'https://sqladmin.googleapis.com/sql/v1beta4/projects/project/operations/cdcd53d4-96fe-41cf-aee4-12cf6ec6394e',
 'targetProject': 'your-project'}
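The import runs asynchronously, which is why the operation above starts out `PENDING`. If the function needs to wait for completion, it can poll `operations().get` until the status is `DONE` (a sketch; `wait_for_operation` and its parameters are hypothetical names, not part of the API):

```python
import time


def wait_for_operation(service, project, operation_name, poll_seconds=5):
    # Poll the Cloud SQL Admin API until the operation reports DONE.
    while True:
        op = service.operations().get(
            project=project, operation=operation_name).execute()
        if op.get("status") == "DONE":
            return op
        time.sleep(poll_seconds)
```

It would be called as `wait_for_operation(service, 'your_project', response['name'])` right after the import request; a production version should also add a timeout and check `op.get('error')`.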

From within Google Cloud Functions you can get auth tokens by querying the metadata server.

There is an easier option, however: use the Cloud SQL Client Library. This will automatically get auth tokens for you.

Both of these options authenticate as the App Engine default service account (`PROJECT_ID@appspot.gserviceaccount.com`). You may need to grant that account additional permissions, for example if you are making cross-project calls.
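As a sketch of that route, Application Default Credentials can fetch a token without querying the metadata server by hand (helper names here are my own; on Cloud Functions, `google.auth.default` resolves to the runtime service account via the metadata server):

```python
def get_access_token():
    # Imported inside the function so the module loads even where the
    # library is absent. ADC needs no key file on Cloud Functions.
    import google.auth
    from google.auth.transport.requests import Request

    credentials, _ = google.auth.default(
        scopes=["https://www.googleapis.com/auth/sqlservice.admin"])
    credentials.refresh(Request())  # obtain a fresh OAuth2 access token
    return credentials.token


def auth_headers(token):
    # Headers for a direct call to the Cloud SQL Admin REST API.
    return {
        "Content-Type": "application/json; charset=utf-8",
        "Authorization": "Bearer {}".format(token),
    }
```

The missing `Authorization: Bearer <token>` header is exactly what caused the 401 in the question.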