We are deviating from the Google-suggested scheduled export mechanism for Datastore and instead scheduling Datastore backups via Cloud Scheduler, which targets an HTTP Cloud Function. We want to use the Cloud Function to export our Datastore entities to a particular storage bucket. The reason for this deviation from the standard mechanism is that we want to avoid duplicating non-app-specific code across all our services.
As per the docs, the managed export and import service is available only through the Datastore mode Admin API (REST, RPC), and requests require OAuth 2.0 authorization.
In the Cloud Function, to access the Datastore API endpoint https://datastore.googleapis.com/v1/projects/<APP ID>:export, we require an access_token for the scope https://www.googleapis.com/auth/datastore.
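For context, the export call we intend to make from the function would look roughly like this (a sketch only; the project ID, bucket name, and bearer token are placeholders, and the outputUrlPrefix body field is per the Datastore Admin API export reference):

```python
import json

DATASTORE_EXPORT_URL = "https://datastore.googleapis.com/v1/projects/{}:export"

def build_export_request(project_id, bucket):
    """Build the URL and JSON body for a managed Datastore export."""
    url = DATASTORE_EXPORT_URL.format(project_id)
    body = {"outputUrlPrefix": "gs://{}".format(bucket)}
    return url, json.dumps(body)

# Placeholder values for illustration:
url, body = build_export_request("my-project", "my-backup-bucket")
# The actual call would then be something like:
# requests.post(url, data=body,
#               headers={"Authorization": "Bearer " + access_token,
#                        "Content-Type": "application/json"})
```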
In standard GAE application code on the Python 2.7 runtime, we can get an access_token as in the example below -
from google.appengine import app_identity
access_token, _ = app_identity.get_access_token('https://www.googleapis.com/auth/datastore')
But Cloud Functions use the Python 3.7 runtime, so importing google.appengine fails with: ModuleNotFoundError: No module named 'google.appengine'
How can we get an access_token for the required scope? Please suggest a reference to Python code or documentation. Thanks.
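One possible approach on the Python 3.7 runtime is to use the google-auth library's application default credentials, which resolve to the function's service account. This is a sketch under that assumption, not a verified solution:

```python
DATASTORE_SCOPE = "https://www.googleapis.com/auth/datastore"

def get_access_token(scope=DATASTORE_SCOPE):
    """Fetch an OAuth 2.0 access token for the given scope using
    application default credentials (the function's service account)."""
    # Imported lazily so this module can still be loaded in environments
    # where google-auth is not installed.
    import google.auth
    import google.auth.transport.requests

    credentials, _project = google.auth.default(scopes=[scope])
    # Force a token fetch; credentials.token is None until refreshed.
    credentials.refresh(google.auth.transport.requests.Request())
    return credentials.token
```

Inside a Cloud Function this requires no key file; the runtime supplies credentials for the function's service account automatically.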
oauth2client is deprecated and google.oauth2 should be used instead, but there is still an issue: the Cloud Function throws the error module 'google' has no attribute 'oauth2'
– Prashant Jamkhande
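A likely explanation for the error in that comment (an assumption, not confirmed from the post): google is a namespace package, so google.oauth2 is only reachable as an attribute after the subpackage itself has been imported, e.g. via from google.oauth2 import service_account. The stdlib xml package shows the same behavior, which this sketch demonstrates in a fresh interpreter:

```python
import subprocess
import sys

def submodule_visible_after_plain_import(package, submodule):
    """In a fresh interpreter, import only the top-level package and
    report whether the submodule is reachable as an attribute."""
    code = "import {p}; print(hasattr({p}, '{s}'))".format(p=package, s=submodule)
    result = subprocess.run([sys.executable, "-c", code],
                            capture_output=True, text=True)
    return result.stdout.strip() == "True"

# 'import xml' alone does not make xml.etree available as an attribute,
# just as 'import google' alone does not make google.oauth2 available.
print(submodule_visible_after_plain_import("xml", "etree"))
```

The fix is to import the subpackage explicitly (from google.oauth2 import service_account) rather than accessing it as an attribute of google.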