I'm trying to write some data to a BigQuery table from my Dataflow pipeline, but the writes are failing with the following error message in Stackdriver:
{
  "error": {
    "errors": [
      {
        "domain": "global",
        "reason": "required",
        "message": "Login Required",
        "locationType": "header",
        "location": "Authorization"
      }
    ],
    "code": 401,
    "message": "Login Required"
  }
}
I've already tried authenticating my gcloud CLI by running gcloud auth application-default login and gcloud auth login before starting the Dataflow pipeline from my local machine.
The BigQuery API is also enabled in my Google Cloud console, and this entire setup worked just fine a few days ago.
What I think is happening is that my Dataflow pipeline doesn't have sufficient privileges to write to my BigQuery table, but I can't find a way to fix this in the docs.
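For what it's worth, a 401 "Login Required" usually means the client found no credentials at all, whereas a permissions problem would normally surface as a 403. As a sketch (this helper is purely illustrative, not part of any Google SDK), here is the kind of pre-flight check I run locally to confirm Application Default Credentials are discoverable before launching the pipeline; the default ADC path below is an assumption based on where gcloud auth application-default login writes its file on Linux/macOS:

```python
import os

def check_adc_credentials():
    """Verify Application Default Credentials are discoverable.

    Illustrative pre-flight check: a 401 from BigQuery typically means no
    credentials were found at all, not that the account lacks permissions
    (which would usually be a 403).
    """
    path = os.environ.get("GOOGLE_APPLICATION_CREDENTIALS")
    if path is None:
        # Assumed default location written by
        # `gcloud auth application-default login` on Linux/macOS.
        default = os.path.expanduser(
            "~/.config/gcloud/application_default_credentials.json")
        if os.path.isfile(default):
            return default
        raise RuntimeError(
            "No credentials found: set GOOGLE_APPLICATION_CREDENTIALS or "
            "run 'gcloud auth application-default login'")
    if not os.path.isfile(path):
        raise RuntimeError(
            "GOOGLE_APPLICATION_CREDENTIALS points to a missing file: "
            + path)
    return path
```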
Would appreciate any leads on this.
GOOGLE_APPLICATION_CREDENTIALS before kicking off your pipeline/code? cloud.google.com/docs/authentication/… - Graham Polley