I'm trying to create a new data transfer job in BigQuery using Python and google-cloud-bigquery-datatransfer, but I run into this error:
google.api_core.exceptions.PermissionDenied: 403 User does not have sufficient permission: bigquery.transfers.update is required on project PROJECT_ID
I've included my code below, which isn't terribly interesting. I am running it with GOOGLE_APPLICATION_CREDENTIALS=/path/to/secrets/service-account.json set. This service account has the Owner role because I thought that upping its privileges might get me past the permissions error.
Now I'm starting to think that this API doesn't work at all with a service account, or am I crazy? I see authorization_code and CheckValidAuth in the documentation, but it says they're optional.
Does the Google BigQuery Data Transfer API work with service accounts?
import os

from google.cloud import bigquery_datatransfer_v1
import google.protobuf.json_format

client = bigquery_datatransfer_v1.DataTransferServiceClient()
parent = client.project_path('PROJECT_ID')

transfer_config = {
    "destination_dataset_id": "neilo",
    "display_name": "NeilO Data Transfer Test",
    "data_source_id": "amazon_s3",
    "params": {
        "destination_table_name_template": "test_table",
        "data_path": "s3://bucket/path/to/files/*.csv.gz",
        "access_key_id": os.environ['AWS_ACCESS_KEY_ID'],
        "secret_access_key": os.environ['AWS_SECRET_ACCESS_KEY'],
        "file_format": "CSV"
    },
    "schedule": "daily"
}

transfer_config = google.protobuf.json_format.ParseDict(
    transfer_config, bigquery_datatransfer_v1.types.TransferConfig())

response = client.create_transfer_config(parent, transfer_config)
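For what it's worth, before suspecting the API itself I wanted to confirm which identity my key file actually carries, since a 403 can also mean the wrong key is being picked up. Here is a minimal standard-library sketch I used (describe_key_file is my own hypothetical helper, not part of any Google library):

```python
import json

def describe_key_file(path):
    # Load the service-account JSON key and report the fields that
    # identify which account the client libraries will authenticate as.
    with open(path) as f:
        key = json.load(f)
    # type, client_email, and project_id are present in every
    # service-account key file exported from the Cloud Console.
    return {
        "type": key.get("type"),
        "client_email": key.get("client_email"),
        "project_id": key.get("project_id"),
    }
```

Calling it on the path in GOOGLE_APPLICATION_CREDENTIALS confirmed the key belongs to the Owner service account in the right project, so the credentials themselves seem to load correctly.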
The problem is bigquery.transfers.update and bigquery.transfers.get permissions? - Nick_Kh