
I am setting up an export/import routine for our Bigtable. I was able to successfully export a Bigtable table into Avro files using the Dataflow job template "Cloud Bigtable to Avro files on Cloud Storage". However, when I try to import the table from the corresponding export files, I get the following error:

NOT_FOUND: Error while mutating the row 'C\035I\370\331\314G\346\204\003;S\333\312Ee0\024K\353\\000\372\300;\232\312\001' (projects/tvc-project/instances/dave-backup-test/tables/Tabledave) : Requested column family not found Error mutating row ….with mutations [set_cell { family_name: "f" column_qualifier: "last_updated" timestamp_micros: 1542667887527000 value: "\000\000\001g.+$Y"

This occurred both when the table did not exist in the Bigtable instance and after I had created the table with the column family mentioned in the error. I created the import Dataflow job with the Cloud Dataflow template "Avro files on Cloud Storage to Cloud Bigtable".
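For reference, here is a minimal sketch (assuming the google-cloud-bigtable Python admin client) of how I check which column families the destination table actually has before launching the import; the project, instance, table, and family names are the ones from the error above:

```python
# Verify the destination table's column families before importing.
from google.cloud import bigtable

client = bigtable.Client(project="tvc-project", admin=True)
instance = client.instance("dave-backup-test")
table = instance.table("Tabledave")

# list_column_families() returns a dict keyed by family name, e.g. {"f": ...}
families = table.list_column_families()
print("Existing column families:", sorted(families.keys()))

# The failing mutation writes to family "f"; create it if it is missing.
if "f" not in families:
    table.column_family("f").create()
```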

Any assistance is very much appreciated.


1 Answer


Found out the problem: the original Bigtable table contained an additional column family that did not exist in the destination table. Once that additional family was also created in the destination table, the import worked fine.
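For anyone hitting the same error, a minimal sketch (using the google-cloud-bigtable Python client; the source-table ID below is hypothetical) that mirrors the source table's column families onto the destination table before launching the import job:

```python
# Ensure the destination table has every column family that the source table has,
# so the Avro import's set_cell mutations do not fail with "column family not found".
from google.cloud import bigtable

client = bigtable.Client(project="tvc-project", admin=True)
instance = client.instance("dave-backup-test")

source = instance.table("Tabledave-source")  # hypothetical ID of the exported table
destination = instance.table("Tabledave")    # the import target

existing = set(destination.list_column_families().keys())
for family_id in source.list_column_families():
    if family_id not in existing:
        # Create the missing family before running the import Dataflow job.
        destination.column_family(family_id).create()
```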