6
votes

I have a database in a Cloud SQL instance. I would like to copy its content into BigQuery in order to perform analysis. It is not a requirement for me to continuously update the BigQuery dataset. It is OK if the export is done only once.

What is the best way to achieve this?

The 'Create Table' dialog in the BigQuery UI does not allow me to import from Cloud SQL (only File, Cloud Storage, Drive, or Bigtable).

4
I would probably just export it to CSV into GCS, and load it into BigQuery from there. That would be the easiest. stackoverflow.com/questions/27784743/… – Graham Polley

4 Answers

2
votes

Up to now, there is no automated tool to import data into BigQuery from Cloud SQL, so a procedure you can follow consists of:

  1. Export the data from the table you want in your Cloud SQL instance in CSV format, as explained in the documentation.
  2. Import the CSV data into the BigQuery table you want following the procedure also explained in the documentation.

You are done. If your database is large and has many tables, you may want to do the import programmatically, using the API.
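The two steps above can be sketched with the `gcloud` and `bq` command-line tools. The instance, bucket, database, dataset, and table names below are placeholders you would replace with your own:

```shell
# 1. Export one table from the Cloud SQL instance to a CSV file in GCS.
#    The instance's service account needs write access to the bucket.
gcloud sql export csv my-instance gs://my-bucket/customers.csv \
    --database=my_database \
    --query="SELECT * FROM customers"

# 2. Load the CSV file from GCS into a BigQuery table.
#    --autodetect infers the schema; supply an explicit schema for
#    production loads if column types matter.
bq load --source_format=CSV --autodetect \
    my_dataset.customers gs://my-bucket/customers.csv
```

Both commands require an authenticated `gcloud` session with permissions on the Cloud SQL instance, the GCS bucket, and the BigQuery dataset. Repeat the pair once per table if you need more than one.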

8
votes

BigQuery can directly query Cloud SQL through Cloud SQL federated queries. It introduces a new SQL function called EXTERNAL_QUERY(connection_id, external_sql), which runs the external_sql in the Cloud SQL database specified by connection_id.

You need to first create a connection in BigQuery, then reference its connection_id in EXTERNAL_QUERY(). The following is a sample query that copies Cloud SQL data into BigQuery.

INSERT
  demo.customers (column1)
SELECT
   * 
FROM
   EXTERNAL_QUERY("project.us.connection",
                  "SELECT column1 FROM mysql_table;");
2
votes

After creating a connection to your Cloud SQL server, you can use it to create a BigQuery table in a single query.

CREATE TABLE my_dataset.CUSTOMER AS
SELECT * FROM EXTERNAL_QUERY("<your_connection_id>", "SELECT * FROM CUSTOMER");
0
votes

Updated solution: Now in beta, you can use Cloud Data Fusion to do this very easily (it currently supports MySQL and SQL Server).