
I'm trying to transfer an entire table from BigQuery to Cloud SQL. After some research, I exported the table as CSV files into a GCS bucket. Then I created a Cloud SQL MySQL instance, a database, and a table with the same schema as the BigQuery table.
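For reference, the export command I used was something like this (the project, dataset, and table names are placeholders):

bq extract --destination_format=CSV [PROJECT_ID]:[DATASET].[TABLE] gs://[BUCKET_NAME]/fileprefix_*.csv

The wildcard in the destination URI is what shards the output into multiple files.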

Now I'm following the instructions here: https://cloud.google.com/sql/docs/mysql/import-export/importing to import those CSV files into the Cloud SQL database.

It works for a single file, but if I try to use a wildcard like gs://bucket/fileprefix_* I get an error: ERROR: (gcloud.sql.import.csv) HTTPError 403: The service account does not have the required permissions for the bucket.
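The import command I'm running is roughly the following (the instance, database, and table names are placeholders):

gcloud sql import csv [INSTANCE_NAME] gs://[BUCKET_NAME]/fileprefix_* --database=[DATABASE_NAME] --table=[TABLE_NAME]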

My table is approximately 52 GB and was sharded into 69 CSV files when exported to the bucket.

I've tried:

gsutil acl ch -u [SERVICE_ACCOUNT_ADDRESS]:W gs://[BUCKET_NAME]/

gsutil acl ch -u [SERVICE_ACCOUNT_ADDRESS]:R gs://[BUCKET_NAME]/[IMPORT_FILE_NAME]

gsutil acl ch -u [SERVICE_ACCOUNT_ADDRESS]:O gs://[BUCKET_NAME]/[IMPORT_FILE_NAME]

I've also gone to IAM and edited the service account's permissions, but that didn't help.

What permissions does the service account now have? I believe you need storage.objects.list when using wildcards. – Graham Polley

1 Answer


As Graham mentioned in the comment, the permissions you've granted are likely the culprit.

He mentioned storage.objects.list, which you'll definitely need, but I think you may also need storage.buckets.get.

My reasoning is that accessing a single object doesn't require knowing anything about the bucket at all, but expanding a wildcard means listing the bucket's objects, which requires permission on the bucket itself. I'm not sure, but give that a shot if you don't already have those two permissions set on the service account.
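If it helps, one way to grant both at the bucket level is the legacyBucketReader role (which includes storage.buckets.get and storage.objects.list) plus objectViewer so the objects themselves can be read. A rough sketch, with the service account address and bucket name as placeholders:

gsutil iam ch serviceAccount:[SERVICE_ACCOUNT_ADDRESS]:legacyBucketReader gs://[BUCKET_NAME]
gsutil iam ch serviceAccount:[SERVICE_ACCOUNT_ADDRESS]:objectViewer gs://[BUCKET_NAME]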