I have an App Engine scheduled job which runs every day and looks for rows in a PostgreSQL table (self-hosted on GCP, not Cloud SQL) that meet a criterion for archiving. If the criterion is met, the job connects to BigQuery and streams the data there. Every day only a few records qualify for archiving and get written to BigQuery. Is this the most cost-effective approach, or should we try loading the data using Cloud Functions instead? https://cloud.google.com/solutions/performing-etl-from-relational-database-into-bigquery
0 votes
1 Answer
0 votes
App Engine and Cloud Functions serve different purposes. Use App Engine when you want to deploy a full application in a serverless environment. If you just need to integrate services in the cloud, use Cloud Functions. For your case, a small daily archiving task, Cloud Functions seems the better fit.
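As a rough sketch of how that could look, the function below (Pub/Sub-triggered, so Cloud Scheduler can invoke it daily) runs the archive query against PostgreSQL and pushes the qualifying rows to BigQuery with a batch load job rather than streaming inserts. Batch load jobs are free of charge, while streaming inserts are billed per MB, so for a small daily batch the load job is usually the cheaper route. All connection settings, table names, and the archive criterion here are placeholders you would replace with your own:

```python
# main.py - sketch of a Pub/Sub-triggered Cloud Function (Python runtime).
# Connection settings, table/column names, and the 90-day criterion are
# placeholders; adapt them to your own schema.
import os

import psycopg2
from google.cloud import bigquery


def archive_rows(event, context):
    """Copies rows that meet the archive criterion from PostgreSQL to BigQuery."""
    conn = psycopg2.connect(
        host=os.environ["PG_HOST"],
        dbname=os.environ["PG_DB"],
        user=os.environ["PG_USER"],
        password=os.environ["PG_PASSWORD"],
    )
    try:
        with conn.cursor() as cur:
            # Placeholder criterion: archive rows older than 90 days.
            cur.execute(
                "SELECT id, payload, created_at FROM events "
                "WHERE created_at < now() - interval '90 days'"
            )
            rows = [
                {"id": r[0], "payload": r[1], "created_at": r[2].isoformat()}
                for r in cur.fetchall()
            ]
    finally:
        conn.close()

    if not rows:
        return  # Nothing qualified today.

    client = bigquery.Client()
    # load_table_from_json runs a batch load job, which is free,
    # unlike streaming inserts, which are billed per MB.
    job = client.load_table_from_json(rows, "my_project.archive.events")
    job.result()  # Wait for the load job to finish.
```

Note that the same switch from streaming inserts to batch load jobs would also cut the BigQuery cost of your existing App Engine job, independent of where the code runs.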
It's important to remember that Cloud Functions has a time limitation: your code can run for at most 9 minutes per invocation. You can find this and other limitations here.
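The default timeout is lower than that ceiling, so if your archive query is slow you would raise it at deploy time. A 1st-gen deployment of the sketch above might look like this (the function and topic names are placeholders):

```
gcloud functions deploy archive_rows \
  --runtime python310 \
  --trigger-topic daily-archive \
  --timeout 540s
```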
Furthermore, you can find a pricing calculator for GCP products here, which lets you compare the two options for your workload.
If you have any further questions, please let me know.