
I am using the "Copy Table" feature in the BigQuery console to move data from US data centers to EU data centers. It worked fine for datasets with fewer than 1,000 tables, but I am having trouble with larger datasets:

  • the source dataset includes 1,634 tables
  • after the copy finishes, the destination includes only 1,439 tables

I know about the limit of 1,000 table copies per day; the BigQuery Data Transfer copy task correctly schedules itself for the next day to continue with the tables above the 1,000 limit. The task finishes without errors, but many tables are still not copied.

When I re-run the copy routine without checking "overwrite destination tables", it goes over the tables again and correctly reports that it is skipping existing tables, but for some reason it does not see the missing tables that should be copied. The summary says "The transfer run has completed successfully." and the log says "Identified 0 tables to copy in the source dataset projectID.datasetID in region us. Skipped 1490 tables that already exist."

But there are 195 more non-empty tables that should have been copied.
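As a diagnostic step, it can help to list the table IDs in both datasets and take the set difference, so you know exactly which tables the transfer is missing. This is only a sketch: the project and dataset names are placeholders from the question, and in practice the two ID lists would come from the BigQuery Python client (e.g. `{t.table_id for t in client.list_tables("projectID.datasetID")}`) or from `bq ls`.

```python
def find_missing_tables(source_ids, dest_ids):
    """Return table IDs that exist in the source dataset but not in the destination."""
    return sorted(set(source_ids) - set(dest_ids))

# Small local example with stand-in table names; with real data you would
# pass the table-ID lists fetched from the US and EU datasets.
source = ["events_2023", "events_2024", "users", "orders"]
dest = ["events_2023", "users"]
print(find_missing_tables(source, dest))  # ['events_2024', 'orders']
```

Comparing the resulting list against the transfer log would at least confirm whether the missing tables share a common pattern (e.g. creation date or naming).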

What am I missing?


1 Answer


I do not know why, but I just re-ran the task and now all the missing tables have been copied... Strange...