
I am trying to copy data from one Bigtable table to another, but I can't find any direct way to do it. There is an option to export a Bigtable table to Cloud Storage and then import it back into Bigtable from the Storage files, but that is a time-consuming process. Can someone suggest an alternative?

1 Answer


It seems that you indeed cannot copy a table directly between Bigtable instances. However, you can write a script that uses gcloud commands to automate the process of exporting your table to Cloud Storage and then importing it into the destination Bigtable instance; a sketch of such a script follows the links below.

You can find more information on how to write the gcloud commands for this process here:

1) Exporting to Cloud Storage: https://cloud.google.com/dataflow/docs/guides/templates/provided-batch#running-the-cloud-bigtable-to-cloud-storage-avro-file-template

2) Importing to BigTable: https://cloud.google.com/dataflow/docs/guides/templates/provided-batch#running-the-cloud-storage-avro-file-to-cloud-bigtable-template
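As a rough sketch, the two template runs can be chained in a shell script. All names below (project ID, instance IDs, table ID, bucket, region, and job names) are placeholders you would substitute with your own values. Also note that `gcloud dataflow jobs run` returns as soon as the job is submitted, so in practice you need to wait for the export job to finish before starting the import:

```bash
#!/usr/bin/env bash
# Placeholder values -- replace with your own.
PROJECT=my-project             # GCP project hosting both instances
SRC_INSTANCE=source-instance   # Bigtable instance to copy from
DST_INSTANCE=dest-instance     # Bigtable instance to copy to
TABLE=my-table                 # table ID to copy
BUCKET=gs://my-staging-bucket  # Cloud Storage bucket for the Avro files
REGION=us-central1             # region to run the Dataflow jobs in

# Step 1: export the source table to Avro files in Cloud Storage.
gcloud dataflow jobs run bigtable-export \
  --gcs-location gs://dataflow-templates/latest/Cloud_Bigtable_to_GCS_Avro \
  --region "$REGION" \
  --parameters \
bigtableProjectId="$PROJECT",\
bigtableInstanceId="$SRC_INSTANCE",\
bigtableTableId="$TABLE",\
outputDirectory="$BUCKET/export",\
filenamePrefix="$TABLE-"

# Step 2: after the export job has succeeded, import the Avro files
# into the destination instance.
gcloud dataflow jobs run bigtable-import \
  --gcs-location gs://dataflow-templates/latest/GCS_Avro_to_Cloud_Bigtable \
  --region "$REGION" \
  --parameters \
bigtableProjectId="$PROJECT",\
bigtableInstanceId="$DST_INSTANCE",\
bigtableTableId="$TABLE",\
inputFilePattern="$BUCKET/export/$TABLE-*"
```

Note that the import template expects the destination table to already exist, with the same column families as the source table.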

If you are making this copy for the sake of a backup, you might be interested in enabling Bigtable replication instead: https://cloud.google.com/bigtable/docs/replication-overview
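For reference, enabling replication amounts to adding a second cluster to the existing instance, roughly like this (the instance ID, cluster ID, zone, and node count below are placeholders):

```bash
# Add a replica cluster to an existing Bigtable instance;
# Bigtable then replicates data between the clusters automatically.
gcloud bigtable clusters create my-replica-cluster \
  --instance=my-instance \
  --zone=europe-west1-b \
  --num-nodes=3
```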