
I have an Oracle table which has 80 columns and is partitioned on the state column. My requirement is to create a Hive table with a similar schema to the Oracle table, partitioned on state.

I tried using the sqoop --create-hive-table option, but I keep getting this error:

ERROR sqoop.Sqoop: Got exception running Sqoop: java.lang.IllegalArgumentException: Partition key state cannot be a column to import.
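For reference, the command I run is roughly of this shape (connection string, credentials and table names below are just placeholders):

  sqoop import \
    --connect jdbc:oracle:thin:@//dbhost:1521/ORCL \
    --username myuser -P \
    --table MYSCHEMA.MYTABLE \
    --hive-import \
    --create-hive-table \
    --hive-table mydb.mytable \
    --hive-partition-key STATE \
    --hive-partition-value 'XX' \
    -m 1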

I understand that in Hive the partition column should not be in the table definition, but then how do I get around the issue?

I do not want to write the create table commands manually, as I have 50 such tables to import and would like to use Sqoop.

Any suggestions or ideas?

Thanks


1 Answer


There is a workaround for this. Below is the procedure I follow:

  1. On Oracle, run a query to get the schema of the table (column names and data types) and store the output in a file.
  2. Move that file to Hadoop.
  3. On Hadoop, create a shell script which constructs an HQL file (a sketch of such a script is shown after this list).
  4. That HQL file contains the Hive CREATE TABLE statement with the columns, built from the Oracle schema file copied to Hadoop in step 2.
  5. The script only needs the Hive database name, table name, partition column name, HDFS path, etc. as parameters, depending on your level of customization. At the end of the shell script, add "hive -f <HQL filename>".
  6. Once everything is in place, each table creation takes just a couple of minutes.
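Here is a rough sketch of steps 3 to 5 as a single shell script, just to give the idea. It assumes the Oracle schema file from steps 1 and 2 holds one "COLUMN_NAME DATA_TYPE" pair per line (for example from a query on ALL_TAB_COLUMNS), uses a deliberately simplified Oracle-to-Hive type mapping, and all names and paths are placeholders, so adjust it to your environment:

  #!/bin/bash
  # build_hive_table.sh -- sketch of steps 3-5 of the procedure above.
  # Usage: ./build_hive_table.sh <hive_db> <hive_table> <partition_col> <schema_file> <hdfs_path>
  # <schema_file> is the Oracle schema dump from step 1, assumed to hold one
  # "COLUMN_NAME DATA_TYPE" pair per line, e.g. produced on Oracle by:
  #   SELECT column_name, data_type FROM all_tab_columns
  #   WHERE table_name = 'MYTABLE' ORDER BY column_id;

  set -e
  HIVE_DB=$1
  HIVE_TABLE=$2
  PART_COL=$3
  SCHEMA_FILE=$4
  HDFS_PATH=$5
  HQL_FILE="${HIVE_TABLE}_create.hql"

  # Very simplified Oracle -> Hive type mapping; extend as needed.
  map_type() {
    case "$1" in
      NUMBER)          echo "DECIMAL(38,10)" ;;
      DATE|TIMESTAMP*) echo "TIMESTAMP" ;;
      *)               echo "STRING" ;;
    esac
  }

  {
    echo "CREATE TABLE IF NOT EXISTS ${HIVE_DB}.${HIVE_TABLE} ("
    first=1
    while read -r col dtype; do
      # The partition column must NOT appear in the regular column list.
      [ "$col" = "$PART_COL" ] && continue
      hive_type=$(map_type "$dtype")
      if [ "$first" -eq 1 ]; then
        printf "  %s %s" "$col" "$hive_type"
        first=0
      else
        printf ",\n  %s %s" "$col" "$hive_type"
      fi
    done < "$SCHEMA_FILE"
    echo ""
    echo ")"
    echo "PARTITIONED BY (${PART_COL} STRING)"
    echo "STORED AS TEXTFILE"
    echo "LOCATION '${HDFS_PATH}';"
  } > "$HQL_FILE"

  # Final step of the procedure: run the generated HQL.
  hive -f "$HQL_FILE"

You can then drive it from a small loop over your 50 table names, for example:

  while read -r tbl; do
    ./build_hive_table.sh mydb "$tbl" state "${tbl}_schema.txt" "/user/hive/warehouse/mydb/${tbl}"
  done < table_list.txt

After the tables exist with the right layout, the actual data can be loaded partition by partition with sqoop import.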