0 votes

I have to create a Hive table from data present in Oracle tables. I'm doing a Sqoop import, which converts the Oracle data into HDFS files, and then creating a Hive table on top of those files. The Sqoop job completes successfully and the files are generated in the HDFS target directory. Then I run the create table script in Hive. The table gets created, but it is empty; no data is seen in the Hive table.

Has anyone faced a similar problem?

2
Please cross-check which HDFS directory you are importing the data into. If you are creating a Hive table, its location should be the same directory. Please provide the import directory and your create table statement. - Sandeep Singh
Cross-checked the HDFS location against the directory specified in the CREATE statement; they are exactly the same. I also checked the metadata of the created Hive table, and it too points to that HDFS location. - Jonathan
Imported directory: /user/rajendrap/webdata/WPT_Booking/Stage_Web_Prod_Trav_Booking

Create script:

CREATE TABLE DEFAULT.STAGE_WEB_PROD_TRAV_BOOKING (
  SHIPCODE STRING,
  PORTCODE STRING,
  DURATION STRING,
  SAILDATE STRING,
  NUM_OF_TRAVELLERS DECIMAL,
  NUM_OF_BOOKINGS DECIMAL,
  NUM_OF_BOOKING_WEB DECIMAL
)
PARTITIONED BY (RDATE STRING)
LOCATION '/user/rajendrap/webdata/WPT_Booking/Stage_Web_Prod_Trav_Booking';

- Jonathan

2 Answers

1 vote

Hive's default field delimiter is Ctrl-A ('\001'); if you don't specify a delimiter, Hive assumes that default. Add the line below to your Hive create table statement.

row format delimited fields terminated by '\t'
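Applied to the table definition from the question, the full DDL might look like this (a sketch only: it assumes the Sqoop import wrote tab-separated text files, e.g. via --fields-terminated-by '\t'; adjust the delimiter to match whatever Sqoop actually produced):

```sql
-- Sketch: same table as in the question, plus an explicit delimiter so Hive
-- can parse the Sqoop-generated text files. The '\t' delimiter is an
-- assumption about the Sqoop output format, not taken from the question.
CREATE TABLE DEFAULT.STAGE_WEB_PROD_TRAV_BOOKING (
  SHIPCODE STRING,
  PORTCODE STRING,
  DURATION STRING,
  SAILDATE STRING,
  NUM_OF_TRAVELLERS DECIMAL,
  NUM_OF_BOOKINGS DECIMAL,
  NUM_OF_BOOKING_WEB DECIMAL
)
PARTITIONED BY (RDATE STRING)
ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
LOCATION '/user/rajendrap/webdata/WPT_Booking/Stage_Web_Prod_Trav_Booking';
```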

0 votes

Your Hive script, and therefore your expectation, is wrong. You are trying to create a partitioned table directly on top of data you have already imported; partitions don't work that way. If your CREATE TABLE had no partition clause, you would be able to see the data.

Basically, if you want a partitioned table, you can't create it directly on the underlying data as you tried above. To get Hive partitions, load the data from an intermediate table (or from the Sqoop output directory) into your partitioned table.
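A minimal sketch of that two-step approach (the staging table name is illustrative, and it assumes the Sqoop files are tab-delimited with the partition value as the last column; neither detail is confirmed in the question):

```sql
-- 1. Non-partitioned EXTERNAL staging table over the Sqoop output directory.
--    Assumption: rdate is present as the last column in the data files.
CREATE EXTERNAL TABLE default.stage_web_prod_trav_booking_raw (
  shipcode STRING,
  portcode STRING,
  duration STRING,
  saildate STRING,
  num_of_travellers DECIMAL,
  num_of_bookings DECIMAL,
  num_of_booking_web DECIMAL,
  rdate STRING
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
LOCATION '/user/rajendrap/webdata/WPT_Booking/Stage_Web_Prod_Trav_Booking';

-- 2. Load into the partitioned table, letting Hive create the
--    partitions dynamically from the rdate values.
SET hive.exec.dynamic.partition.mode = nonstrict;

INSERT OVERWRITE TABLE default.stage_web_prod_trav_booking
PARTITION (rdate)
SELECT shipcode, portcode, duration, saildate,
       num_of_travellers, num_of_bookings, num_of_booking_web,
       rdate
FROM default.stage_web_prod_trav_booking_raw;
```

The staging table is declared EXTERNAL so that dropping it later does not delete the Sqoop output files.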