I'm trying to transfer an entire table from BigQuery to Cloud SQL. After some research, I exported the table as CSV files to a GCS bucket. Then I created a Cloud SQL MySQL instance, a database, and a table with the same schema as the BigQuery table.
Now I'm following the instructions here: https://cloud.google.com/sql/docs/mysql/import-export/importing to import those CSV files into the Cloud SQL database.
It works for a single file, but if I use a wildcard like gs://bucket/fileprefix_* I get an error: ERROR: (gcloud.sql.import.csv) HTTPError 403: The service account does not have the required permissions for the bucket.
My table is approximately 52 GB and was sharded into 69 CSV files when exported to the bucket.
I've tried:

gsutil acl ch -u [SERVICE_ACCOUNT_ADDRESS]:W gs://[BUCKET_NAME]/
gsutil acl ch -u [SERVICE_ACCOUNT_ADDRESS]:R gs://[BUCKET_NAME]/[IMPORT_FILE_NAME]
gsutil acl ch -u [SERVICE_ACCOUNT_ADDRESS]:O gs://[BUCKET_NAME]/[IMPORT_FILE_NAME]
I've also gone into IAM and edited the service account's permissions, but that didn't help.
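Since single-file imports succeed, one possible workaround while the wildcard permissions are sorted out is to loop over the shards and import them one at a time. A minimal sketch, where the bucket, prefix, instance, database, and table names are placeholders rather than my actual values:

```shell
# Workaround sketch: import each CSV shard individually, since
# single-file imports work even though the wildcard fails.
# All names below (my-bucket, fileprefix_, my-instance, my_database,
# my_table) are placeholders.
for uri in $(gsutil ls 'gs://my-bucket/fileprefix_*'); do
  gcloud sql import csv my-instance "$uri" \
    --database=my_database \
    --table=my_table \
    --quiet   # skip the interactive confirmation prompt per file
done
```

This is slower than a single wildcard import (each call is a separate import operation), but it avoids the listing permission the wildcard needs.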
The service account needs the storage.objects.list permission on the bucket when using wildcards. – Graham Polley
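Acting on that hint, a sketch of granting the instance's service account bucket-level read access, which includes storage.objects.list (needed to expand the wildcard). The instance and bucket names are placeholders:

```shell
# Look up the Cloud SQL instance's service account
# (my-instance is a placeholder instance name).
SA=$(gcloud sql instances describe my-instance \
  --format='value(serviceAccountEmailAddress)')

# roles/storage.objectViewer grants storage.objects.get and
# storage.objects.list on the bucket, which wildcard imports require.
gsutil iam ch "serviceAccount:${SA}:roles/storage.objectViewer" gs://my-bucket
```

Note that the per-object ACL commands above grant access to individual objects only; listing the bucket's contents to expand a wildcard requires a bucket-level permission, which is why they didn't resolve the 403.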