
I'm running a TensorFlow model on Google ML Engine. When model training finishes, I want to store a JSON string with the results in Datastore. For this, I am using the following:

import datetime
import json

from gcloud import datastore

def put_json_into_datastore(json_str, project_id, entity_type):
    """
    Store a JSON string in Datastore.
    """
    # Instantiate the client for the project
    datastore_client = datastore.Client(project_id)
    # The name/ID for the new entity
    name = str(datetime.datetime.now())
    # The Cloud Datastore key for the new entity
    entity_key = datastore_client.key(entity_type, name)
    # Prepare the new entity
    entity = datastore.Entity(key=entity_key)
    # Copy the decoded JSON fields into the entity
    entity.update(json.loads(json_str))
    # Put the entity into Datastore
    datastore_client.put(entity)
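
For context, the function is called once training finishes, roughly like this (the project ID and entity kind below are placeholders):

import json

results = {'accuracy': 0.92, 'loss': 0.31}
put_json_into_datastore(json.dumps(results),
                        project_id='my-project-id',
                        entity_type='TrainingResult')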

However, I am getting the error 'Forbidden: 403 Request had insufficient authentication scopes.' Here's the full error trace:

Traceback (most recent call last):
  File "/usr/lib/python2.7/runpy.py", line 162, in _run_module_as_main
    "__main__", fname, loader, pkg_name)
  File "/usr/lib/python2.7/runpy.py", line 72, in _run_code
    exec code in run_globals
  File "/root/.local/lib/python2.7/site-packages/trainer/train.py", line 243, in <module>
    FLAGS.entity_type)
  File "/root/.local/lib/python2.7/site-packages/trainer/data_helpers.py", line 253, in put_json_into_datastore
    datastore_client.put(entity)
  File "/usr/local/lib/python2.7/dist-packages/gcloud/datastore/client.py", line 329, in put
    self.put_multi(entities=[entity])
  File "/usr/local/lib/python2.7/dist-packages/gcloud/datastore/client.py", line 355, in put_multi
    current.commit()
  File "/usr/local/lib/python2.7/dist-packages/gcloud/datastore/batch.py", line 260, in commit
    self._commit()
  File "/usr/local/lib/python2.7/dist-packages/gcloud/datastore/batch.py", line 243, in _commit
    self.project, self._commit_request, self._id)
  File "/usr/local/lib/python2.7/dist-packages/gcloud/datastore/connection.py", line 342, in commit
    _datastore_pb2.CommitResponse)
  File "/usr/local/lib/python2.7/dist-packages/gcloud/datastore/connection.py", line 124, in _rpc
    data=request_pb.SerializeToString())
  File "/usr/local/lib/python2.7/dist-packages/gcloud/datastore/connection.py", line 98, in _request
    raise make_exception(headers, error_status.message, use_json=False)
Forbidden: 403 Request had insufficient authentication scopes.

Do I need to grant access somewhere so that ML Engine can access Datastore?

Dumb question, but just making sure... are the Datastore and the Cloud ML Engine job within the same project? – T.Okahara
@T.Okahara Yep, same project. I'm able to save files from ML Engine to Storage, but I am unable to access Datastore. – Filipe

1 Answer


The Cloud ML Engine service doesn't run with permissions sufficient to access Datastore. One way around this is to upload credentials for a service account that does have access to Cloud Datastore (e.g., a JSON service account key file) along with your training package, and then use that key file to build a client with credentials capable of accessing Datastore.
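
A minimal sketch of that second step, assuming the key file is packaged with your trainer (the path and project ID below are placeholders, and this relies on the library's from_service_account_json helper):

from gcloud import datastore

# Hypothetical path to the uploaded service account key file; adjust to
# wherever the file actually lands inside your training package.
KEY_PATH = 'trainer/service_account_key.json'

# Build a client authenticated as the service account rather than the
# default ML Engine identity.
datastore_client = datastore.Client.from_service_account_json(
    KEY_PATH, project='your-project-id')

The function in your question could then accept this client as an argument instead of constructing one itself.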