Is it possible to use the datalab packages in a Jupyter setup installed on a regular Google Cloud VM instance, rather than a Datalab instance running in a container? If so, I have not been able to figure out how to set the credentials correctly to access Google Cloud services.
I have followed the datalab directions to pip install datalab and enable the Jupyter extension. It all appears to go well, but I am unable to connect to any Google Cloud services, e.g. BigQuery.
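For reference, the setup steps I ran were roughly the following (the exact nbextension name is from the install docs as I remember them, so treat it as an assumption):

!pip install datalab
!jupyter nbextension install --py datalab.notebook --sys-prefix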
from google.datalab import Context

# Build the default context and point it at my project.
context = Context.default()
context.set_project_id('<the-proj-id>')

# The context believes it is signed in...
context._is_signed_in()
# True

# ...using the project's default compute service account.
context.credentials.service_account_email
# '<default-service-account-for-project>@developer.gserviceaccount.com'
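To isolate whether the problem is datalab-specific or in the credentials themselves, a check with the plain google-cloud-bigquery client under the same application-default credentials is the kind of thing I have in mind (a sketch; it assumes the google-cloud-bigquery package is installed, and the project id is a placeholder):

import google.auth
from google.cloud import bigquery

# Pick up the VM's application-default credentials, as datalab does.
credentials, project = google.auth.default()

client = bigquery.Client(project='<the-proj-id>', credentials=credentials)
for dataset in client.list_datasets():
    print(dataset.dataset_id)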
The <default-service-account-for-project>@developer.gserviceaccount.com service account has editor rights on BigQuery.
%load_ext google.datalab.kernel
set_datalab_project_id('<the-proj-id>')
However, the %bq magic does not have permission to connect to BigQuery:
%bq datasets list
# HTTP request failed: Insufficient Permission
Is datalab designed to work outside the Datalab Docker environment? If so, how do you set the credentials correctly?
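For concreteness, this is the kind of explicit override I would expect to work, though I am not sure datalab honors it (the key path is a placeholder, and using a service-account key here is my assumption, not anything from the datalab docs):

import os

# Hypothetical: point application-default credentials at a service-account
# key before creating the context, in case datalab picks them up from there.
os.environ['GOOGLE_APPLICATION_CREDENTIALS'] = '/path/to/key.json'

from google.datalab import Context
context = Context.default()
context.set_project_id('<the-proj-id>')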
Many thanks.