
I am trying to write a new file (not upload an existing file) to a Google Cloud Storage bucket from inside a Python Google Cloud Function.

  • I tried using google-cloud-storage but it does not have the "open" attribute for the bucket.

  • I tried to use the App Engine library GoogleAppEngineCloudStorageClient, but the function cannot be deployed with this dependency.

  • I tried to use gcs-client, but I cannot pass credentials inside the function because it requires a JSON key file.

Any ideas would be much appreciated.

Thanks.


2 Answers

2
votes

You have to create your file locally and then push it to GCS. You can't create a file dynamically in GCS by calling open().

For this, you can write to the /tmp directory, which is an in-memory file system. Note that you will never be able to create a file bigger than the memory allocated to your function minus the memory footprint of your code. With a 2 GB function, you can expect a maximum file size of about 1.5 GB.
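That pattern (stage the file in /tmp, then push it) can be sketched as below. The bucket and blob names are placeholders, and google-cloud-storage is assumed to be listed in the function's requirements:

```python
import os


def write_temp_file(text: str, name: str = "report.txt") -> str:
    # Cloud Functions only allow writes under /tmp,
    # which is an in-memory file system.
    path = os.path.join("/tmp", name)
    with open(path, "w") as f:
        f.write(text)
    return path


def push_to_gcs(local_path: str, bucket_name: str, blob_name: str) -> None:
    # Placeholder names; requires the google-cloud-storage package.
    from google.cloud import storage

    client = storage.Client()
    client.bucket(bucket_name).blob(blob_name).upload_from_filename(local_path)
```

Inside the function you would call `write_temp_file(...)` first, then `push_to_gcs(path, "my_bucket_name", "media/report.txt")`, and finally delete the local file to release the memory it occupies.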

Note: GCS is not a file system, and you shouldn't use it like one.

1
votes
 from google.cloud import storage
 import io

 # Bucket name.
 bucket_name = "my_bucket_name"

 # Get the bucket that the file will be uploaded to.
 storage_client = storage.Client()
 bucket = storage_client.get_bucket(bucket_name)

 # Create a new blob for the file's content.
 my_file = bucket.blob('media/teste_file01.txt')

 # Create an in-memory file.
 output = io.StringIO("This is a test \n")

 # Upload from the string held in memory.
 my_file.upload_from_string(output.read(), content_type="text/plain")

 output.close()

 # List the files created in the bucket.
 blobs = storage_client.list_blobs(bucket_name)
 for blob in blobs:
     print(blob.name)

 # Make the blob publicly viewable.
 my_file.make_public()
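Worth noting for the original question: newer releases of google-cloud-storage (1.38.0 and later, if I recall correctly) do add a file-like Blob.open(), so you can write to a blob directly without staging a local file or building the whole string in memory first. A minimal sketch, with placeholder bucket and blob names:

```python
def stream_text_to_gcs(bucket_name: str, blob_name: str, text: str) -> None:
    # Placeholder names; requires google-cloud-storage >= 1.38.
    from google.cloud import storage

    client = storage.Client()
    blob = client.bucket(bucket_name).blob(blob_name)
    # blob.open("w") returns a file-like writer that streams to GCS,
    # so nothing is written to local disk.
    with blob.open("w") as f:
        f.write(text)
```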