I am trying to import a table from a MySQL database into a Hive table to understand how Hive import works. The source table is named device, and I had already imported it into my home directory in HDFS earlier. I created the Hive table with the statement below:
create table device_hive (device_num int,device_name varchar(255));
Now I am executing the Sqoop import below to load the data from the device table in MySQL into Hive:
sqoop import --connect jdbc:mysql://localhost/loudacre \
  --username training --password training \
  --table device --columns "device_num,device_name" \
  --hive-import --hive-database hadoopexam --hive-table device_hive \
  --fields-terminated-by '\001'
It fails, stating that the output directory device already exists. The path in the error message points to the device folder in HDFS that I imported with Sqoop earlier.

My question is: why does Sqoop go to my base directory and check for that folder? Since this is a Hive import, shouldn't Sqoop just write to the hive/warehouse directory? If I delete that folder from HDFS, the import works fine. Any suggestions?
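For reference, this is the workaround I am using for now. The exact staging path is an assumption based on my setup (user training, table device); the path in your error message may differ:

```shell
# Remove the leftover staging directory from the earlier plain HDFS import
# (path assumed to be <home dir>/<table name>; check your error message)
hdfs dfs -rm -r /user/training/device

# After this, re-running the same sqoop import command completes without
# the "output directory already exists" error.
```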