I'm trying to run a Spark (Scala) job on a Dataproc cluster that needs to connect to a Pub/Sub pull subscription in the same project, but I'm getting the error below. I presume the machines in my Dataproc cluster are missing the https://www.googleapis.com/auth/pubsub scope.
Can I add additional authentication scopes to the machines in a Dataproc cluster?
Exception in thread "main" com.google.api.client.googleapis.json.GoogleJsonResponseException: 403 Forbidden
{
  "code" : 403,
  "errors" : [ {
    "domain" : "global",
    "message" : "Request had insufficient authentication scopes.",
    "reason" : "forbidden"
  } ],
  "message" : "Request had insufficient authentication scopes.",
  "status" : "PERMISSION_DENIED"
}
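For context, the pull code is roughly the following minimal sketch (it uses the legacy google-api-services-pubsub client, which matches the GoogleJsonResponseException in the stack trace; the project and subscription names are placeholders):

import com.google.api.client.googleapis.auth.oauth2.GoogleCredential
import com.google.api.client.googleapis.javanet.GoogleNetHttpTransport
import com.google.api.client.json.jackson2.JacksonFactory
import com.google.api.services.pubsub.{Pubsub, PubsubScopes}
import com.google.api.services.pubsub.model.PullRequest

import scala.collection.JavaConverters._

object PubsubPullExample {
  def main(args: Array[String]): Unit = {
    val transport   = GoogleNetHttpTransport.newTrustedTransport()
    val jsonFactory = JacksonFactory.getDefaultInstance

    // Application Default Credentials; on a Dataproc worker these come from
    // the VM's service account and are limited to the VM's access scopes.
    var credential = GoogleCredential.getApplicationDefault()
    if (credential.createScopedRequired()) {
      credential = credential.createScoped(PubsubScopes.all())
    }

    val pubsub = new Pubsub.Builder(transport, jsonFactory, credential)
      .setApplicationName("dataproc-pubsub-example")
      .build()

    // Placeholder project and subscription names.
    val subscription = "projects/my-project/subscriptions/my-subscription"

    val pullRequest = new PullRequest()
      .setReturnImmediately(true)
      .setMaxMessages(10)

    // The pull call is where the 403 above is raised.
    val response = pubsub.projects().subscriptions()
      .pull(subscription, pullRequest)
      .execute()

    Option(response.getReceivedMessages).map(_.asScala).getOrElse(Nil).foreach { m =>
      println(new String(m.getMessage.decodeData(), "UTF-8"))
    }
  }
}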
PS: It wouldn't be a problem to recreate the cluster if necessary.