
I'm trying to run a Spark (Scala) job on a Dataproc cluster that needs to connect to a Pub/Sub pull subscription in the same project, but I'm getting the error below. I presume the machines in my Dataproc cluster are missing the 'https://www.googleapis.com/auth/pubsub' scope.

Can I add additional authentication scopes to the machines of a Dataproc cluster?

Exception in thread "main" com.google.api.client.googleapis.json.GoogleJsonResponseException: 403 Forbidden
{
  "code" : 403,
  "errors" : [ {
    "domain" : "global",
    "message" : "Request had insufficient authentication scopes.",
    "reason" : "forbidden"
  } ],
  "message" : "Request had insufficient authentication scopes.",
  "status" : "PERMISSION_DENIED"
}

PS: It wouldn't be a problem to recreate the cluster if necessary.


1 Answer


Custom service account scopes are currently specifiable in the Cloud Dataproc API, but not in the Cloud SDK or the Developer Console. They should be exposed in the Cloud SDK in the next week or so.

In any case, you will need to recreate the cluster with the scope specified; scopes cannot be changed on an existing cluster.
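Until the Cloud SDK support lands, the scopes can be set in the cluster-creation request body sent to the Dataproc API. A minimal sketch of the relevant fragment, assuming the `serviceAccountScopes` field under `gceClusterConfig` and using placeholder project/cluster names:

```json
{
  "projectId": "my-project",
  "clusterName": "my-cluster",
  "config": {
    "gceClusterConfig": {
      "serviceAccountScopes": [
        "https://www.googleapis.com/auth/pubsub"
      ]
    }
  }
}
```

Check the API documentation for how explicitly listed scopes interact with the defaults before relying on this, since you may need to include other scopes (e.g. for Cloud Storage) that the cluster would otherwise get automatically.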