
I am working on a project that crunches data and does a lot of processing, so I chose BigQuery since it has good support for analytical queries. However, the final computed result is stored in a table that has to power my web page (i.e., it is used transactionally/OLTP). My understanding is that BigQuery is not suitable for transactional queries. Looking into alternatives, I realized I can use Dataflow to do the analytical processing and move the data to Cloud SQL (a relational DB fits my purpose).

However, it seems this is not as straightforward as I expected. First I have to create a pipeline to move the data to a Cloud Storage bucket, and only then move it into Cloud SQL.

Is there a better way to manage this? Can I use "Create Job from SQL" in Dataflow to do it? I haven't found any examples that use "Create Job from SQL" to process data and move it to Cloud SQL.
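For reference, this is roughly the two-step idea I have been experimenting with as a Dataflow (Apache Beam) pipeline. All names (dataset, table, connection details) are placeholders, and I am not sure writing row by row from a DoFn is the idiomatic approach:

```python
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


class WriteToCloudSql(beam.DoFn):
    """Writes each BigQuery row into a Cloud SQL (Postgres) table.

    All connection details are placeholders; a real pipeline would
    batch the inserts rather than commit row by row.
    """

    def setup(self):
        import psycopg2  # imported here so each worker resolves it
        self.conn = psycopg2.connect(
            host='10.0.0.3',  # private IP of the Cloud SQL instance
            dbname='appdb', user='app', password='secret')

    def process(self, row):
        with self.conn.cursor() as cur:
            cur.execute(
                'INSERT INTO monthly_returns (user_id, month, pct) '
                'VALUES (%s, %s, %s)',
                (row['user_id'], row['month'], row['pct']))
        self.conn.commit()

    def teardown(self):
        self.conn.close()


with beam.Pipeline(options=PipelineOptions()) as p:
    (p
     # Even this read stages data in a Cloud Storage temp location.
     | 'ReadResults' >> beam.io.ReadFromBigQuery(
         query='SELECT user_id, month, pct FROM dataset.monthly_returns',
         use_standard_sql=True)
     | 'WriteToCloudSql' >> beam.ParDo(WriteToCloudSql()))
```

Even with this approach, ReadFromBigQuery stages the exported data in a temp bucket under the hood, which is what prompted my question.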

Consider a simple example along the lines of Robinhood: compute a user's returns from their portfolio and show a graph of the returns for every month.
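To make that concrete, the kind of analytical query I run in BigQuery looks roughly like this (the portfolio_positions table, its columns, and the project name are hypothetical):

```python
from google.cloud import bigquery

client = bigquery.Client()

# Hypothetical schema: portfolio_positions(user_id, snapshot_date,
# realized_gain, cost_basis). Computes returns per user per month.
query = """
    SELECT
      user_id,
      DATE_TRUNC(snapshot_date, MONTH) AS month,
      SAFE_DIVIDE(SUM(realized_gain), SUM(cost_basis)) AS pct
    FROM `my-project.analytics.portfolio_positions`
    GROUP BY user_id, month
    ORDER BY user_id, month
"""
for row in client.query(query):  # iterating the job waits for results
    print(row.user_id, row.month, row.pct)
```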


GCP has an API for this; if you work in Python, you can easily move your data to BigQuery. – Maryam Abdoli

I want to move the data from BigQuery to Cloud SQL. It seems the pipeline I have to build will always go through a Cloud Storage bucket to create a temp file before moving it to Cloud SQL. – LifeStartsAtHelloWorld

@LifeStartsAtHelloWorld Can you please explain how you have moved data from BigQuery to Cloud SQL? – Ameet

1 Answer


There are other options besides using a pipeline, but in all cases you cannot export table data directly to a local file, to Sheets, or to Drive. The only supported export location is Cloud Storage, as stated on the Exporting table data documentation page.
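In other words, the flow you describe is effectively the supported one: extract the table to Cloud Storage, then import the exported file into Cloud SQL via the Admin API. A minimal sketch of that two-step flow, assuming placeholder project, bucket, and instance names:

```python
from google.cloud import bigquery
from googleapiclient import discovery

PROJECT = 'my-project'  # placeholder

# Step 1: export the BigQuery table to Cloud Storage as CSV.
# The wildcard lets BigQuery shard large tables into several files.
bq = bigquery.Client(project=PROJECT)
extract_job = bq.extract_table(
    'my-project.analytics.monthly_returns',
    'gs://my-bucket/exports/results-*.csv')
extract_job.result()  # block until the export finishes

# Step 2: import the exported CSV into Cloud SQL via the Admin API.
sqladmin = discovery.build('sqladmin', 'v1beta4')
body = {
    'importContext': {
        'fileType': 'CSV',
        'uri': 'gs://my-bucket/exports/results-000000000000.csv',
        'database': 'appdb',
        'csvImportOptions': {'table': 'monthly_returns'},
    }
}
sqladmin.instances().import_(
    project=PROJECT, instance='my-instance', body=body).execute()
```

Note that the Cloud SQL instance's service account needs read access to the bucket, and a wildcard export can produce several files, each of which has to be imported separately.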