I tried creating a table in Hive and exporting it in Avro format.
Eventually I want to load this Avro file into Google BigQuery. For some reason, after the export the Avro schema does not have the correct column names.
create table if not exists test (id int, name varchar(40));
insert into test values (1, "AK");
insert overwrite directory "/tmp/test" stored as avro select * from test;
!sh hadoop fs -cat /tmp/test/*;
The output should contain the column names id and name, but they are translated to _col0 and _col1:
Objavro.schema {"type":"record","name":"baseRecord","fields":[{"name":"_col0","type":["null","int"],"default":null},{"name":"_col1","type":["null",{"type":"string","logicalType":"varchar","maxLength":40}],"default":null}]}
(binary row data follows; the row values 1, AK are visible in the raw bytes)
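One workaround I am considering (untested sketch; the table name test_avro is just an example) is to write into an Avro-backed Hive table with explicitly declared columns instead of using INSERT OVERWRITE DIRECTORY, since the table definition should pin the field names in the generated Avro schema:

```sql
-- Sketch of a possible workaround: declare an Avro table with real column
-- names, then insert into it rather than into a bare directory.
create table if not exists test_avro (id int, name varchar(40))
  stored as avro;

insert overwrite table test_avro
select id, name from test;

-- The Avro files under the table's warehouse location should then carry
-- the field names id and name instead of _col0/_col1.
```

Is this the expected behavior of INSERT OVERWRITE DIRECTORY, or is there a setting that preserves the column names in the directory export?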
Thanks,
AK