
My step:

  1. Get an authorization code: I use this URL to get the auth code:

    https://www.gstatic.com/bigquerydatatransfer/oauthz/auth?clientId=<id from list data sources>&scope=https://www.googleapis.com/auth/adwords%20https://www.googleapis.com/auth/bigquery

  2. Create a service account with the Project Owner role

  3. Execute the sample code:
def run_quickstart():
    from google.cloud import bigquery_datatransfer_v1
    from google.protobuf.struct_pb2 import Struct

    client = bigquery_datatransfer_v1.DataTransferServiceClient()

    project = "<project-id>"  # placeholder
    parent = client.location_path(project, 'us')
    params = Struct()
    params.update({
        "customer_id": "<customer-id>"  # placeholder
    })

    transfer_config = {
        "destination_dataset_id": "test",
        "display_name": "test",
        "data_source_id": "adwords",
        "params": params
    }

    authorization_code = "<authorization_code from step 1>"  # placeholder
    response = client.create_transfer_config(parent, transfer_config, authorization_code)
    print(response)

if __name__ == '__main__':
    run_quickstart()
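As a side note on step 1, the `%20` between the two scopes is just a URL-encoded space. The consent URL can be assembled with the standard library to avoid hand-encoding mistakes (the `clientId` value below stays a placeholder, as in the step above):

```python
from urllib.parse import urlencode, quote

# Base consent endpoint from step 1; clientId remains a placeholder here.
base = "https://www.gstatic.com/bigquerydatatransfer/oauthz/auth"
params = {
    "clientId": "<id from list data sources>",
    "scope": "https://www.googleapis.com/auth/adwords "
             "https://www.googleapis.com/auth/bigquery",
}
# quote_via=quote encodes the space between the scopes as %20 instead of "+".
url = base + "?" + urlencode(params, safe=":/", quote_via=quote)
print(url)
```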

Expected: the data transfer is created. But I got this error instead:

Traceback (most recent call last):
  File "/project/lib/python3.7/site-packages/google/api_core/grpc_helpers.py", line 57, in error_remapped_callable
    return callable_(*args, **kwargs)
  File "/project/lib/python3.7/site-packages/grpc/_channel.py", line 826, in __call__
    return _end_unary_response_blocking(state, call, False, None)
  File "/project/lib/python3.7/site-packages/grpc/_channel.py", line 729, in _end_unary_response_blocking
    raise _InactiveRpcError(state)
grpc._channel._InactiveRpcError: <_InactiveRpcError of RPC that terminated with:
    status = StatusCode.INVALID_ARGUMENT
    details = "Request contains an invalid argument."
    debug_error_string = "{"created":"@1581747953.387292000","description":"Error received from peer ipv6:[2404:6800:4012:1::200a]:443","file":"src/core/lib/surface/call.cc","file_line":1056,"grpc_message":"Request contains an invalid argument.","grpc_status":3}"
>

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "app.py", line 49, in <module>
    run_quickstart()
  File "app.py", line 40, in run_quickstart
    parent, transfer_config, authorization_code)
  File "/project/lib/python3.7/site-packages/google/cloud/bigquery_datatransfer_v1/gapic/data_transfer_service_client.py", line 563, in create_transfer_config
    request, retry=retry, timeout=timeout, metadata=metadata
  File "/project/lib/python3.7/site-packages/google/api_core/gapic_v1/method.py", line 143, in __call__
    return wrapped_func(*args, **kwargs)
  File "/project/lib/python3.7/site-packages/google/api_core/retry.py", line 286, in retry_wrapped_func
    on_error=on_error,
  File "/project/lib/python3.7/site-packages/google/api_core/retry.py", line 184, in retry_target
    return target()
  File "/project/lib/python3.7/site-packages/google/api_core/timeout.py", line 214, in func_with_timeout
    return func(*args, **kwargs)
  File "/project/lib/python3.7/site-packages/google/api_core/grpc_helpers.py", line 59, in error_remapped_callable
    six.raise_from(exceptions.from_grpc_error(exc), exc)
  File "<string>", line 3, in raise_from
google.api_core.exceptions.InvalidArgument: 400 Request contains an invalid argument.

I couldn't figure out which step is wrong. Could you kindly help me figure it out?

--- Update 02-19

@mk_sta

I removed authorization_code and then the API could create a transfer, but the transfer doesn't run. It fails with the error in this screenshot.

(screenshot: transfer run error)

I think the error occurs because authorization_code is what connects the transfer to Google Ads. Does your transfer work fine?

Reference:

Have you tried changing the authentication method against the BigQuery Data Transfer API, i.e. setting the GOOGLE_APPLICATION_CREDENTIALS environment variable instead of using an OAuth2 authorization code? Does the issue remain the same? - Nick_Kh
@mk_sta Yes, I have already set the GOOGLE_APPLICATION_CREDENTIALS environment variable - Rukeith

1 Answer


After reviewing your code snippet, I've noticed some points worth mentioning:

  1. According to the documentation pages, the BigQuery Data Transfer Service requires the initiating service account to hold the bigquery.admin role in order to inherit the bigquery.transfers.update permission:

All users that will create transfers must be granted the bigquery.admin predefined Cloud IAM role. The bigquery.admin role includes the following BigQuery Data Transfer Service permissions:

bigquery.transfers.update

bigquery.transfers.get

  2. Whenever you invoke the create_transfer_config function, authorization_code is an optional parameter that can be omitted; omitting it falls back to a different authentication model for the transfer configuration, depending on the particular use case.

I've tried to execute the above code using a service account key associated with my GCP project:

export GOOGLE_APPLICATION_CREDENTIALS="<some_path>/[FILE_NAME].json"
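Note that the exported variable must be visible to the Python process that constructs the client. A quick stdlib check (the helper name `adc_key_path` is just for illustration):

```python
import os

def adc_key_path():
    """Return the service-account key file that Application Default
    Credentials will pick up, or None if the variable is not exported."""
    return os.environ.get("GOOGLE_APPLICATION_CREDENTIALS")

if __name__ == "__main__":
    print("ADC key file:", adc_key_path())
```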

I've made a few adjustments to the source code in order to get rid of authorization_code in the transfer config:

def run_quickstart():
    from google.cloud import bigquery_datatransfer_v1
    from google.protobuf.struct_pb2 import Struct

    client = bigquery_datatransfer_v1.DataTransferServiceClient()

    project = "<project-id>"  # placeholder
    parent = client.location_path(project, 'us')
    params = Struct()
    params.update({
        "customer_id": "<customer-id>"  # placeholder
    })

    transfer_config = {
        "destination_dataset_id": "test",
        "display_name": "test",
        "data_source_id": "adwords",
        "params": params
    }

    response = client.create_transfer_config(parent, transfer_config)
    print(response)

if __name__ == '__main__':
    run_quickstart()

That worked for me, and I successfully received the response body:

name: "projects/.../..."
destination_dataset_id: "test"
display_name: "test"
update_time {
  seconds: 1582044242
  nanos: 441442000
}
data_source_id: "adwords"
next_run_time {
  seconds: 1581897600
}
params {
  fields {
    key: "customer_id"
    value {
      string_value: "XXXX"
    }
  }
}
user_id: <user_id>
dataset_region: "us"
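For reference, the next_run_time and update_time fields in the response are protobuf Timestamps; their seconds values convert to human-readable UTC datetimes with the standard library:

```python
from datetime import datetime, timezone

# seconds value of next_run_time from the response above
next_run = datetime.fromtimestamp(1581897600, tz=timezone.utc)
print(next_run.isoformat())  # 2020-02-17T00:00:00+00:00
```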