I am writing a Spark job to run on a Dataproc cluster in Project A, but the job itself will pull data from BigQuery in Project B using the BigQuery Connector. I have owner privileges on both projects, but the job runs under a service account. The response I'm getting in the stack trace is this:
{
  "code" : 403,
  "errors" : [ {
    "domain" : "global",
    "message" : "Access Denied: Table ABC:DEF.ghi: The user [email protected] does not have bigquery.tables.get permission for table ABC:DEF.ghi.",
    "reason" : "accessDenied"
  } ],
  "message" : "Access Denied: Table ABC:DEF.ghi: The user [email protected] does not have bigquery.tables.get permission for table ABC:DEF.ghi."
}
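
For context, the read path follows the standard Hadoop BigQuery connector pattern. This is a minimal sketch rather than the exact job; the project ID, staging bucket, and table coordinates are placeholders taken from the error above:

import com.google.cloud.hadoop.io.bigquery.{BigQueryConfiguration, GsonBigQueryInputFormat}
import com.google.gson.JsonObject
import org.apache.hadoop.io.LongWritable
import org.apache.spark.sql.SparkSession

object CrossProjectRead {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("bq-cross-project-read").getOrCreate()
    val conf = spark.sparkContext.hadoopConfiguration

    // Project A: the project the Dataproc cluster runs in (used for billing/staging).
    conf.set(BigQueryConfiguration.PROJECT_ID_KEY, "project-a")        // placeholder
    conf.set(BigQueryConfiguration.GCS_BUCKET_KEY, "staging-bucket")   // placeholder

    // Project B: the project that owns the source table (ABC:DEF.ghi in the error).
    BigQueryConfiguration.configureBigQueryInput(conf, "ABC", "DEF", "ghi")

    // The connector exports the table to GCS and reads it back as JSON records;
    // the 403 above is thrown when this read is triggered.
    val tableData = spark.sparkContext.newAPIHadoopRDD(
      conf,
      classOf[GsonBigQueryInputFormat],
      classOf[LongWritable],
      classOf[JsonObject])

    println(tableData.count())
  }
}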