
Is it possible to use the datalab packages in a Jupyter setup installed on a regular Google Cloud VM instance, rather than a Datalab instance running the container? If so, I have not been able to figure out how to set the credentials correctly to access Google Cloud services.

I have followed the directions for datalab to pip install datalab and enable the Jupyter extension. It all appears to go well, but I am unable to connect to any Google Cloud services, e.g. BigQuery.
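For reference, the install amounted to roughly the following; the nbextension command is what I recall from the datalab package's install notes, so treat the exact flags as an assumption for your Jupyter version.

# run in a notebook cell on the VM (drop the leading '!' to run from a shell)
!pip install datalab
!jupyter nbextension install --py datalab.notebook --sys-prefix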

from google.datalab import Context
context = Context.default()
context.set_project_id('<the-proj-id>')

context._is_signed_in()
# True

context.credentials.service_account_email
# '<default-service-account-for-project>@developer.gserviceaccount.com'

The <default-service-account-for-project>@developer.gserviceaccount.com service account has editor rights on BigQuery.
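As a further sanity check, this is a rough sketch of how I looked at what the application default credentials resolve to on the VM, which I understand to be the same credentials the datalab Context picks up (it assumes the google-auth and requests packages are installed):

import google.auth
import google.auth.transport.requests

# on a GCE VM this returns the metadata-server credentials for the
# instance's service account, together with the project id
credentials, project = google.auth.default()
credentials.refresh(google.auth.transport.requests.Request())

print(project)
print(getattr(credentials, 'service_account_email', None))
# should match the service account shown by the datalab context above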

%load_ext google.datalab.kernel
set_datalab_project_id('<the-proj-id>')

However, the %bq magic does not have permission to connect to BigQuery:

%bq datasets list
# HTTP request failed: Insufficient Permission

Is datalab designed to work outside the Datalab Docker environment?

If so, how do you set the credentials correctly?

Many thanks.


1 Answer


Yes, you can install and use datalab in a regular, non-Datalab VM instance.

The permission error I had was caused by the "Cloud API access scopes" on the VM instance itself. The instance did not have Cloud API access to any cloud services; the default appears to be no access to Cloud APIs at all.
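For anyone checking the same thing, the scopes an instance actually has can be read from inside the VM by querying the metadata server; a minimal sketch using only the standard library:

import urllib.request

# lists the OAuth scopes granted to the VM's default service account;
# without a BigQuery (or cloud-platform) scope, API calls fail with
# "Insufficient Permission" no matter what IAM roles the account holds
req = urllib.request.Request(
    'http://metadata.google.internal/computeMetadata/v1/instance/'
    'service-accounts/default/scopes',
    headers={'Metadata-Flavor': 'Google'})
print(urllib.request.urlopen(req).read().decode())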

I enabled the BigQuery access scope on the VM instance details page and datalab then worked as expected.

I would have discovered this sooner if I had tested BigQuery at the command line of the VM instance: it fails there with the same permission error, which shows the issue was not with datalab.
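The equivalent check can also be done from a notebook with the plain BigQuery client library, independent of datalab; a small sketch assuming the google-cloud-bigquery package is installed:

from google.cloud import bigquery

# uses the application default credentials, i.e. the VM's service account
client = bigquery.Client(project='<the-proj-id>')
print([d.dataset_id for d in client.list_datasets()])
# on an instance without a BigQuery access scope this raises a 403
# "Insufficient Permission" error, just like the %bq magic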