
I'm trying to estimate the total monthly cost of my Google Cloud program.

My program loads an input file from Amazon S3 into Google Cloud Storage, uses that imported data to create a table in Google BigQuery, and later exports the created table back into Cloud Storage in JSON format.

Say I have a file that is 50 GB in size and it resides in Amazon S3.

My program would import this 50 GB file into Cloud Storage, load it into a BigQuery table, and finally export it back to Cloud Storage.
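For reference, those three steps can be sketched with the `gsutil` and `bq` command-line tools. The bucket, dataset, and file names below are placeholders, not taken from the question:

```shell
# 1. Copy the input file from Amazon S3 into Cloud Storage
#    (gsutil can read s3:// URLs once AWS credentials are configured in ~/.boto)
gsutil cp s3://my-s3-bucket/input.csv gs://my-gcs-bucket/input.csv

# 2. Load the file into a BigQuery table
bq load --source_format=CSV my_dataset.my_table gs://my-gcs-bucket/input.csv

# 3. Export the table back to Cloud Storage as newline-delimited JSON
#    (tables larger than 1 GB must be exported with a wildcard, which
#    shards the output across multiple files)
bq extract --destination_format=NEWLINE_DELIMITED_JSON \
    my_dataset.my_table 'gs://my-gcs-bucket/output/export-*.json'
```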

I've estimated the 50 GB file to contain 600,000,000 (600M) rows.

In the Google Cloud pricing calculator, under Cloud Storage:

I've set the 'Storage Data' field to 100 GB (50 GB input file, 50 GB exported file).

I have set the 'Entity Reads' field (assuming that each entity is a row in my table) to 600,000,000 (a read operation has to be performed to load the data into BigQuery).

And finally, I have set the 'Entity Writes' field to 1,200,000,000 (one write operation when importing the data into Cloud Storage from S3, and another when exporting the data from BigQuery).

This gives me a $2,530.52 monthly cost estimate which I find to be quite high for a 50 GB data file.

What I'd like to know is: have I estimated these values correctly?

Also, apart from the Cloud Storage costs, what other costs will I incur for my program? (I'm using the Google Cloud Transfer service as well.)


1 Answer


I'm not sure which page you filled in, but there is no 'Entity Reads' field on the BigQuery page of the calculator; fields like that belong to Cloud Datastore, which is priced per entity operation and doesn't apply to your workflow.

Here is the BigQuery calculation:

Storage 100 GB
Streaming Inserts 0 MB
Queries 10 TB
Total Estimated Cost: $47.00 per 1 month 

https://cloud.google.com/products/calculator/#id=b395df1a-1fa8-4e7d-9dce-8a1a578916a8
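That $47.00 figure can be reproduced by hand. The rates below are assumptions based on BigQuery's published pricing at the time ($0.02/GB/month for storage, $5/TB for queries with the first 1 TB per month free); check the current price list before relying on them:

```python
# Reproduce the calculator's monthly estimate.
# NOTE: the rates are assumed from BigQuery pricing at the time of writing.
STORAGE_RATE_PER_GB = 0.02  # $/GB/month (BigQuery storage)
QUERY_RATE_PER_TB = 5.00    # $/TB scanned by queries
FREE_QUERY_TB = 1           # first 1 TB of queries per month is free

storage_gb = 100
query_tb = 10

storage_cost = storage_gb * STORAGE_RATE_PER_GB                     # $2.00
query_cost = max(query_tb - FREE_QUERY_TB, 0) * QUERY_RATE_PER_TB   # $45.00
total = storage_cost + query_cost

print(f"${total:.2f} per month")  # $47.00 per month
```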

Please note that loading from and exporting to files is free; you only need to pay for storing the files.