I want to "import-all-tables" using sqoop from mysql to a Hive Custom Database ( Not Hive default Database )
Steps tried:
- Created a custom database in Hive under "/user/hive/warehouse/Custom.db"
- Granted all permissions on that directory, so Sqoop should have NO issues writing into it (a rough sketch of these setup commands follows this list)
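The setup looked roughly like this (the database name and permission mode here are placeholders for what I actually ran):

hive -e "CREATE DATABASE custom"                        # creates /user/hive/warehouse/custom.db
hdfs dfs -chmod -R 777 /user/hive/warehouse/custom.db   # open permissions so Sqoop can write here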
Used the command below, with the "--hive-database" option, on the CDH 5.7 VM:
sqoop import-all-tables \
  --connect "jdbc:mysql://quickstart.cloudera:3306/retail_db" \
  --username retail_dba \
  --password cloudera \
  --hive-database "/user/hive/warehouse/sqoop_import_retail.db"
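For context, this is roughly how I checked where the tables ended up (my exact commands may have differed slightly):

hive -e "SHOW TABLES IN default"               # the imported tables all appear here
hive -e "SHOW TABLES IN sqoop_import_retail"   # comes back empty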
Tables were created in the Hive default database only, not in the custom DB (in this case: "sqoop_import_retail.db").
On re-running, it instead tries to create the tables in the previously used HDFS directories (e.g. /user/cloudera/categories) and errors out because the output directory already exists:
16/08/30 00:07:14 WARN security.UserGroupInformation: PriviledgedActionException as:cloudera (auth:SIMPLE) cause:org.apache.hadoop.mapred.FileAlreadyExistsException: Output directory hdfs://quickstart.cloudera:8020/user/cloudera/categories already exists
16/08/30 00:07:14 ERROR tool.ImportAllTablesTool: Encountered IOException running import job: org.apache.hadoop.mapred.FileAlreadyExistsException: Output directory hdfs://quickstart.cloudera:8020/user/cloudera/categories already exists
[cloudera@quickstart etc]$
How do I address these issues?
1. Creating the tables in the custom Hive DB.
2. Flushing the previous directory references with Sqoop (I can clear them manually, as sketched below, but would prefer Sqoop to handle it).
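For issue 2, the only workaround I know is removing the stale target directories by hand before re-running the import (the path below is taken from the error output above; every other table directory left behind under /user/cloudera would need the same treatment):

hdfs dfs -rm -r /user/cloudera/categories   # delete the leftover output dir from the earlier run

Is there a Sqoop-side way to do this automatically?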