I'm trying to import a SQL dump file from Google Cloud Storage into Cloud SQL (Postgres database) as a daily job.
I saw in the Google documentation for the Cloud SQL Admin API that there is a way to programmatically import a SQL dump file (URL: https://cloud.google.com/sql/docs/postgres/admin-api/v1beta4/instances/import#examples), but quite honestly, I'm a bit lost. I haven't programmed against APIs before, and I think that's a major factor here.
In the documentation, I see there's an HTTP POST request as well as example code, but I'm not sure where either of these would go. Ideally, I'd like to use other Cloud products to make this daily job happen. Any help would be much appreciated.
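For context, my rough understanding from the examples on that page is that the POST request boils down to something like the following in Python with the discovery-based google-api-python-client; the project, instance, bucket, and database names below are placeholders, and I haven't verified this end to end:

```python
# Sketch of calling the Cloud SQL Admin API's instances.import method.
# Assumes Application Default Credentials are available and
# google-api-python-client is installed; all names are placeholders.
import google.auth
from googleapiclient import discovery

credentials, _ = google.auth.default()
service = discovery.build('sqladmin', 'v1beta4', credentials=credentials)

body = {
    'importContext': {
        'fileType': 'SQL',                 # importing a plain SQL dump
        'uri': 'gs://my-bucket/dump.sql',  # placeholder GCS object
        'database': 'mydb',                # placeholder target database
    }
}

# 'import' is a reserved word in Python, so the generated client
# exposes the instances.import method as import_.
operation = service.instances().import_(
    project='my-project', instance='my-instance', body=body
).execute()
print(operation)
```

Note that the call returns a long-running operation rather than completing synchronously, so a real daily job would presumably need to poll the operation (e.g. via operations().get()) until it finishes.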
(Side note: I was looking into creating a cron job in Compute Engine for this, but I'm worried about ease of maintenance, especially since I have other jobs I want to build that depend on this one. I'd also read that Dataflow could help with this, but I haven't seen any tutorials suggesting it can. I'm fairly new to Dataflow as well, so that could be a factor too.)
Comments:

"Have a look at google-cloud-composer, which is essentially airflow, for this. There are a lot of Operators to move files between various locations. You can find more information here: cloud.google.com/composer/docs/quickstart" – Gaurav Taneja

"There is airflow.contrib.operators.mysql_to_gcs.MySqlToGoogleCloudStorageOperator, but I am not sure if that would help. If MySqlHook works, then I would use that and GoogleCloudStorageHook and create a custom operator, or use them with PythonOperator. That being said, BashOperator should also work, but I have not tried it." – Gaurav Taneja
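Putting the comments' suggestion together, a minimal sketch of a Composer/Airflow DAG that triggers the Cloud SQL import daily via a PythonOperator might look like this. The import paths assume contrib-era Airflow 1.x (matching the operators named in the comments), and the project, instance, bucket, and database names are placeholders:

```python
# Sketch of a daily Airflow DAG (as would run on Cloud Composer) that calls
# the Cloud SQL Admin API's instances.import method. All resource names
# are placeholders.
from datetime import datetime

import google.auth
from googleapiclient import discovery

from airflow import DAG
from airflow.operators.python_operator import PythonOperator


def import_sql_dump():
    """Trigger a Cloud SQL import of a SQL dump sitting in GCS."""
    credentials, _ = google.auth.default()
    service = discovery.build('sqladmin', 'v1beta4', credentials=credentials)
    body = {
        'importContext': {
            'fileType': 'SQL',
            'uri': 'gs://my-bucket/daily-dump.sql',  # placeholder GCS path
            'database': 'mydb',                      # placeholder database
        }
    }
    # The instances.import REST method is exposed as import_ in Python.
    service.instances().import_(
        project='my-project', instance='my-instance', body=body
    ).execute()


with DAG(
    dag_id='daily_cloudsql_import',
    start_date=datetime(2018, 1, 1),
    schedule_interval='@daily',
    catchup=False,
) as dag:
    PythonOperator(
        task_id='import_dump_to_cloudsql',
        python_callable=import_sql_dump,
    )
```

Downstream jobs that depend on this one could then be added as further tasks in the same DAG, which is the maintenance advantage Composer has over a hand-rolled cron job on Compute Engine.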