
So I have a 150 GB SQLite database that I want to import into Google Cloud SQL (it doesn't matter whether it's PostgreSQL or MySQL). The reason I want to do this is to have more flexible storage and faster computation.

However, I can't find an easy intro on how this is done. It seems the way to do it is to also create a Google Cloud Storage bucket, dump my SQLite database to an SQL file, upload that file to the bucket, and then import it into Google Cloud SQL. Is this the best and quickest way?

Dumping a 150 GB database would probably require lots of space and lots and lots of time.

You have labeled this as Google Cloud Datastore, which is a NoSQL solution. I would remove that label. - Sjuul Janssen

1 Answer


Ultimately, what you need to do is the following (sketched below):

  • make an SQL dump of your SQLite database
  • convert it to be MySQL- or PostgreSQL-compatible (either manually or using some tool)
  • upload it to a Google Cloud Storage bucket
  • import it into your Cloud SQL instance
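
A rough command-line sketch of those steps might look like the following; the file, bucket, instance, and database names here are placeholders, and the conversion step depends on which tool you pick:

    # 1. dump the SQLite database to a plain SQL file
    sqlite3 mydb.sqlite .dump > dump.sql

    # 2. convert dump.sql to MySQL-/PostgreSQL-compatible SQL (manually or with a tool)

    # 3. upload the converted dump to a Cloud Storage bucket
    gsutil cp dump.sql gs://my-bucket/dump.sql

    # 4. import the dump into the Cloud SQL instance
    gcloud sql import sql my-instance gs://my-bucket/dump.sql --database=mydb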

You can try to minimize the intermediate steps by using something like https://github.com/dimitri/pgloader. It seems you can use this tool to connect your SQLite database directly to a Cloud SQL instance. It will still take time; there's no getting around transferring ~150 GB of data to Google.
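
For example, assuming a PostgreSQL Cloud SQL instance reachable at a public IP (the host, user, password, and database names below are placeholders), a single invocation along these lines migrates the schema and data in one pass:

    pgloader ./mydb.sqlite postgresql://myuser:mypassword@CLOUDSQL_IP/mydb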

Where is your SQLite database stored now? If it's already on a GCE VM, running pgloader from the same region as your Cloud SQL instance should make it much faster.