I want to create a Cloud Function that is executed daily at 01:00. The function should:
- generate a dataframe
- [export as dataframe.csv] <---- not sure if required
- push the dataframe(or .csv) to a bucket
.....
- Question1: Is it possible to push a dataframe directly to a bucket?
- Question2: How can I create a .csv file within the Cloud Function (CF) so that it can be pushed to a bucket?
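As a minimal sketch of an answer to Question1: a DataFrame object cannot be pushed as-is, but its CSV serialization can be uploaded straight from memory with `blob.upload_from_string()`, so an intermediate .csv file is not strictly required. The bucket and blob names below are placeholder assumptions.

```python
import pandas as pd

def dataframe_to_csv_string(df):
    # Serialize the DataFrame to CSV entirely in memory; to_csv() with no
    # path argument returns the CSV text as a string.
    return df.to_csv(index=False)

def upload_dataframe_as_csv(df, bucket_name, blob_name):
    # Upload the in-memory CSV text directly; no /tmp file is needed.
    from google.cloud import storage  # imported lazily, as in the code below
    client = storage.Client()
    bucket = client.get_bucket(bucket_name)
    blob = bucket.blob(blob_name)
    blob.upload_from_string(dataframe_to_csv_string(df),
                            content_type="text/csv")

# Usage (hypothetical names):
# upload_dataframe_as_csv(cars, "my-bucket", "cars.csv")
```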
Updated code (still giving an error):
def push_cars(request):  # HTTP-triggered functions receive exactly one argument: the Flask request object
    import pandas as pd
    from datetime import datetime
    from google.cloud import storage

    cars_dict = {'Brand': ['Honda Civic', 'Toyota Corolla', 'Ford Focus', 'Audi A4'],
                 'Price': [22000, 25000, 27000, 35000]}
    cars = pd.DataFrame(cars_dict, columns=['Brand', 'Price'])

    timestamp = datetime.now().strftime("%Y_%m_%d-%H_%M_%S")
    name = "cars_" + timestamp + ".csv"

    # /tmp is the only writable directory inside a Cloud Function.
    # (The earlier open('/tmp/test.csv', "w") block overwrote the CSV
    # with just the file name, so it has been removed.)
    cars.to_csv("/tmp/test.csv", index=False)

    # On Cloud Functions the runtime service account is used automatically;
    # setting GOOGLE_APPLICATION_CREDENTIALS is only needed when running locally.
    # os.environ['GOOGLE_APPLICATION_CREDENTIALS'] = "My-project.json"

    target_bucket = 'cars:python_gogo'  # note: GCS bucket names cannot contain ':'
    storage_client = storage.Client()
    bucket = storage_client.get_bucket(target_bucket)
    blob = bucket.blob(name)  # name_output was never defined; use the name built above
    blob.upload_from_filename("/tmp/test.csv")
    return "Uploaded " + name
To deploy this in the cloud, you need to create a requirements.txt with the following content (datetime is part of the standard library and must not be listed here):
requests
pandas
google-cloud-storage
In the cloud shell, I am using the following to deploy this CF: gcloud functions deploy push_cars --entry-point=push_cars --runtime=python37 --memory=1024MB --region=us-east1 --allow-unauthenticated --trigger-http
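An HTTP trigger alone will not run the function daily at 01:00; one common approach is a Cloud Scheduler job that calls the function's URL on a cron schedule. A sketch, assuming the project ID MY_PROJECT and the region/function name from the deploy command above (adjust the URL and time zone to your setup):

```shell
gcloud scheduler jobs create http push-cars-daily \
  --schedule="0 1 * * *" \
  --uri="https://us-east1-MY_PROJECT.cloudfunctions.net/push_cars" \
  --http-method=GET \
  --time-zone="Etc/UTC"
```

The cron expression "0 1 * * *" fires every day at 01:00 in the given time zone.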