0 votes

I am trying to import data into a Hive table using a Sqoop command. The Hive table is partitioned by date2, and the date is in the format "9/6/2017 00:00:00". Sqoop throws an error when I import data using the date column.

Teradata table:

column1, date2, column3
1, 9/6/2017 00:00:00, qwe
2, 9/20/2017 00:00:00, wer

Sqoop command:

sqoop import \
--connect jdbc:teradata://<server>/database=<db_name> \
--connection-manager org.apache.sqoop.teradata.TeradataConnManager \
--username un \
--password 'pwd' \
--table <tbl_name> \
--where "cast(date2 as Date) > date '2017-09-07' and cast(date2 as Date) < date '2017-09-20'" \
--hive-import --hive-table <db_name>.<tbl_name> \
--hive-partition-key date2 \
-m 1

Error

ERROR teradata.TeradataSqoopImportHelper: Exception running Teradata import job java.lang.IllegalArgumentException:Wrong FS: /usr/tarun/date2=1900-01-01 00%3A00%3A00
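One clue in that path: `%3A` is the percent-encoding of `:`. Hive escapes characters that are not legal in HDFS path components when it builds partition directories, so partitioning directly on a full timestamp value produces escaped directory names like the one in the error. A quick shell check, decoding the directory name taken from the error message above:

```shell
# The partition directory name from the error message, with Hive's
# percent-encoding still in place.
encoded='date2=1900-01-01 00%3A00%3A00'

# Replace each %3A with the ':' it encodes.
decoded=${encoded//%3A/:}
echo "$decoded"   # date2=1900-01-01 00:00:00
```

This suggests the timestamp values themselves (with their colons) are being used as partition values, which is worth keeping in mind regardless of the command-syntax fix below.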

I have created an external table in Hive, partitioned by the date2 column. - Vishnu

1 Answer

0 votes

When I split your command across multiple lines, it looks like you missed one \ character, which is probably why Sqoop is complaining: --hive-import does not end with "\". The Hive table name is also missing from the command.

sqoop import \
--connect jdbc:teradata://<server>/database=<db_name> \
--connection-manager org.apache.sqoop.teradata.TeradataConnManager \
--username un \
--password 'pwd' \
--table <tbl_name> \
--where "cast(date2 as Date) > date '2017-09-07' and cast(date2 as Date) < date '2017-09-20'" \
--hive-import \
--hive-table tarun121 \
--hive-partition-key date2 \
-m 1

An alternative is to try the create-hive-table command:

sqoop create-hive-table \
--connect jdbc:teradata://localhost:port/schema \
--table hive_table_name \
--fields-terminated-by ','
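Note that create-hive-table only creates the table definition in the Hive metastore from the source table's schema; the data still has to be imported in a separate step. A minimal sketch of that follow-up import (not tested against your cluster), reusing the placeholders from the question:

```shell
# Hypothetical follow-up import: create-hive-table has already defined
# the table, so this step only moves the data into it.
sqoop import \
--connect jdbc:teradata://<server>/database=<db_name> \
--connection-manager org.apache.sqoop.teradata.TeradataConnManager \
--username un \
--password 'pwd' \
--table <tbl_name> \
--hive-import \
--hive-table <db_name>.<tbl_name> \
-m 1
```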

Let me know if this solves the issue.