
I am trying to Sqoop data from an SAP HANA database. My goal is a direct Hive import. The command I use works for most tables, but in some cases the import fails because of special characters in the SAP table name, e.g. the table "/BIC/AS100/" fails because of the "/" characters in its name.

Because of this, I am unable to do a direct Hive import. Is there any way I can import such a table and create a new Hive table with a proper name?
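
For reference, a minimal sketch of the failing direct import (host, port, and credentials are placeholders; wrapping the table name in double quotes for HANA's special-character identifiers is an assumption):

sqoop import \
  --connect jdbc:sap://hanahost:30015 \
  --driver com.sap.db.jdbc.Driver \
  --username myuser --password mypass \
  --table '"/BIC/AS100/"' \
  --hive-import \
  -m 1

Without an explicit Hive table name, Sqoop derives one from "/BIC/AS100/", which is not a valid Hive identifier, and that appears to be why the import fails here.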

Export to HDFS and create an external table in Hive pointing to the HDFS location. - Sathiyan S
I had already tried that - it works, but the problem is that the table has many columns and we need the column names as well, to map them. E.g. the table has 50 columns and we Sqoop it - we then have to check the table structure and create the Hive table accordingly. What if we don't have access to the table's metadata and can only Sqoop the tables? - Srikant
I also tried the Avro format: extract the schema into an .avsc file and create the table from it. But in that case the column names start with "/", which Hive converts to an underscore - and that is again a problem, because Hive does not support column names starting with an underscore. - Srikant
So I think the only option might be storing in HDFS and creating an external table (sketched below). :( - Sathiyan S
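
For what it's worth, a sketch of that workaround under the same placeholder assumptions; the column list must be written by hand, which is exactly the drawback raised above:

# Step 1: Sqoop the table into a plain HDFS directory, no Hive step
sqoop import \
  --connect jdbc:sap://hanahost:30015 \
  --driver com.sap.db.jdbc.Driver \
  --username myuser --password mypass \
  --table '"/BIC/AS100/"' \
  --target-dir /data/bic_as100 \
  --fields-terminated-by ',' \
  -m 1

-- Step 2: in Hive, point an external table at that directory
CREATE EXTERNAL TABLE bic_as100 (
  col1 STRING,
  col2 INT
  -- ... all remaining columns must be listed manually
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
LOCATION '/data/bic_as100';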

1 Answer


Thanks, Sathiyan, the issue is resolved. I did a direct Hive import, specifying a new table name of my choice. The column names are still imported with the special character, but we can handle that in Hive, e.g.:

SELECT `/bic/xyz` FROM tablename; (the backtick escapes the special character)
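
In other words, a sketch of the resolution under the same placeholder assumptions: the same direct import, plus Sqoop's --hive-table option to give the Hive table a clean name (as100 is just an example):

sqoop import \
  --connect jdbc:sap://hanahost:30015 \
  --driver com.sap.db.jdbc.Driver \
  --username myuser --password mypass \
  --table '"/BIC/AS100/"' \
  --hive-import \
  --hive-table as100 \
  -m 1

SELECT `/bic/xyz` FROM as100;  -- backticks escape the slash-prefixed column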