
I have a TensorFlow model that runs properly on my local machine, and I now want to deploy it on Google Cloud Platform. However, the data is stored in a Google Cloud Storage bucket.

So my questions are as follows:

  1. How do I access the Google Cloud Storage bucket to run my model, both on my local machine and from the Google Cloud Platform console?
  2. The bucket's data is split across multiple files, so how do I import the files together using Python?

Thank you in advance.


1 Answer

  1. You can use gsutil to access the Google Cloud Storage bucket and copy its files to the VM's disk:

    gsutil cp gs://your-bucket/* .

  2. Use the google-cloud-storage Python client library:

from google.cloud import storage

# create a storage client from a service-account key file
storage_client = storage.Client.from_service_account_json('your_credential.json')

# get the bucket by name
bucket = storage_client.get_bucket('yourbucket')

# get_blob() does not accept wildcards, so list the blobs
# and download each one
for blob in bucket.list_blobs():
    # download the blob's contents as bytes
    json_data = blob.download_as_string()
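Once the objects have been copied to local disk (step 1), a small helper can load them together. This is a minimal sketch: the function name `load_json_files` and the assumption that each file holds a single JSON document are illustrative, not from the original answer.

```python
import glob
import json
import os

def load_json_files(directory):
    """Load every .json file in `directory` and return their parsed contents."""
    records = []
    # sort the paths so the load order is deterministic across runs
    for path in sorted(glob.glob(os.path.join(directory, '*.json'))):
        with open(path, 'r') as f:
            records.append(json.load(f))
    return records
```

The same pattern works for CSV or TFRecord files by swapping the glob pattern and the per-file reader.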
