There are different ways to manage this; I think the easiest one is to do what Jim Morrison suggested and use the Cloud Datastore to Cloud Storage Text Dataflow template. Within that approach there are several possibilities:
First, download the public template to your machine with this command:
gsutil cp gs://dataflow-templates/latest/Datastore_to_GCS_Text .
(Be careful not to delete the final ., which you can replace with the directory where you want to download the template.)
Then, as Jim Morrison explains in his answer, edit the downloaded file and replace the template parameters with your own values [1].
When that's done, upload it again to a bucket that you own. For example:
gsutil cp Datastore_to_GCS_Text gs://datastore_to_cloudsql/template/
Then you can run the job from the GCP Console (Create job from template, selecting Custom Template) [2].
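If you prefer the command line, the same job can be launched with gcloud instead of the Console. A sketch, reusing the bucket from the example above; the job name, project ID, kind and output path are placeholders, and the parameter names are the ones documented for this template [1]:

gcloud dataflow jobs run datastore-to-gcs-text \
    --gcs-location gs://datastore_to_cloudsql/template/Datastore_to_GCS_Text \
    --parameters \
datastoreReadGqlQuery="SELECT * FROM MY_KIND_NAME",\
datastoreReadProjectId=MY_PROJECT_ID,\
textWritePathPrefix=gs://datastore_to_cloudsql/output/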
When you have your JSON file, convert it to CSV and import it into Cloud SQL [3]. Alternatively, you may want to add a transform function, as the template parameters allow [1], so that the exported data is written directly in CSV format instead of JSON.
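For the import step itself, the gcloud equivalent of [3] is a one-liner. A sketch with placeholder instance, database and table names, where result.csv is the hypothetical converted file:

gcloud sql import csv MY_INSTANCE gs://datastore_to_cloudsql/output/result.csv \
    --database=MY_DATABASE --table=MY_TABLE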
The exported file contains one JSON document per row, like this one, which I have formatted to make it clearer:
{
  "key": {
    "partitionId": {"projectId": "MY_PROJECT_ID"},
    "path": [{"kind": "MY_KIND_NAME", "id": "4814888656437248"}]
  },
  "properties": {
    "MY_FIRST_COLUMN": {"integerValue": "3"},
    "SECOND_COLUMN": {"stringValue": "foobarfoobarfoobar"},
    "THIRD_COLUMN": {"stringValue": "foobar"}
  }
}
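As a minimal sketch of the JSON-to-CSV step in Python, assuming one entity document per line as above and that every property holds a single typed value; the file names and column list are placeholders to adjust to your kind:

import csv
import json

# Illustrative column names; adjust to your kind's properties.
COLUMNS = ["MY_FIRST_COLUMN", "SECOND_COLUMN", "THIRD_COLUMN"]

def property_value(prop):
    # Each property is a one-key dict such as {"integerValue": "3"}
    # or {"stringValue": "foobar"}; take that single value.
    return next(iter(prop.values()))

with open("datastore_export.json") as src, open("result.csv", "w", newline="") as dst:
    writer = csv.writer(dst)
    for line in src:
        entity = json.loads(line)
        props = entity["properties"]
        writer.writerow([property_value(props[col]) for col in COLUMNS])

The same logic could also live in the transform function mentioned above, so that Dataflow writes CSV directly instead of JSON.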
[1] https://cloud.google.com/dataflow/docs/templates/provided-templates#cloud-datastore-to-cloud-storage-text
[2] https://cloud.google.com/dataflow/docs/templates/executing-templates#using-the-gcp-console
[3] https://cloud.google.com/sql/docs/mysql/import-export/importing#importing_csv_files_to_title_short