4 votes

I'm trying to import a MySQL table into Hive using Sqoop import; however, after the command executes, the CLI just sits there: nothing happens and it hangs indefinitely. The command and issue details are below.

[cloudera@quickstart bin]$ sqoop create-hive-table --connect jdbc:mysql://10.X.X.XX:XXXX/rkdb --username root -P --table employee --hive-table emps

Warning: /usr/lib/sqoop/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
17/08/30 22:20:02 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6-cdh5.12.0
Enter password:
17/08/30 22:20:05 INFO tool.BaseSqoopTool: Using Hive-specific delimiters for output. You can override
17/08/30 22:20:05 INFO tool.BaseSqoopTool: delimiters with --fields-terminated-by, etc.
17/08/30 22:20:05 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
17/08/30 22:20:06 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM employee AS t LIMIT 1
17/08/30 22:20:06 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM employee AS t LIMIT 1
17/08/30 22:20:08 INFO hive.HiveImport: Loading uploaded data into Hive

Logging initialized using configuration in jar:file:/usr/lib/hive/lib/hive-common-1.1.0-cdh5.12.0.jar!/hive-log4j.properties

Any clue would be much appreciated.

Edit: I included --verbose in the command to get detailed logging:

  Note: /tmp/sqoop-cloudera/compile/5b58dd4e681df3737c3f8ce4f32013ac/employee.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
17/08/31 01:34:13 DEBUG orm.CompilationManager: Could not rename /tmp/sqoop-cloudera/compile/5b58dd4e681df3737c3f8ce4f32013ac/employee.java to /usr/lib/spark/examples/lib/./employee.java
17/08/31 01:34:13 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-cloudera/compile/5b58dd4e681df3737c3f8ce4f32013ac/employee.jar
17/08/31 01:34:13 DEBUG orm.CompilationManager: Scanning for .class files in directory: /tmp/sqoop-cloudera/compile/5b58dd4e681df3737c3f8ce4f32013ac
17/08/31 01:34:13 DEBUG orm.CompilationManager: Got classfile: /tmp/sqoop-cloudera/compile/5b58dd4e681df3737c3f8ce4f32013ac/employee$1.class -> employee$1.class
17/08/31 01:34:13 DEBUG orm.CompilationManager: Got classfile: /tmp/sqoop-cloudera/compile/5b58dd4e681df3737c3f8ce4f32013ac/employee$2.class -> employee$2.class
17/08/31 01:34:13 DEBUG orm.CompilationManager: Got classfile: /tmp/sqoop-cloudera/compile/5b58dd4e681df3737c3f8ce4f32013ac/employee$3.class -> employee$3.class
17/08/31 01:34:13 DEBUG orm.CompilationManager: Got classfile: /tmp/sqoop-cloudera/compile/5b58dd4e681df3737c3f8ce4f32013ac/employee$4.class -> employee$4.class
17/08/31 01:34:13 DEBUG orm.CompilationManager: Got classfile: /tmp/sqoop-cloudera/compile/5b58dd4e681df3737c3f8ce4f32013ac/employee$FieldSetterCommand.class -> employee$FieldSetterCommand.class
17/08/31 01:34:13 DEBUG orm.CompilationManager: Got classfile: /tmp/sqoop-cloudera/compile/5b58dd4e681df3737c3f8ce4f32013ac/employee.class -> employee.class
17/08/31 01:34:13 DEBUG orm.CompilationManager: Finished writing jar file /tmp/sqoop-cloudera/compile/5b58dd4e681df3737c3f8ce4f32013ac/employee.jar
17/08/31 01:34:13 WARN manager.MySQLManager: It looks like you are importing from mysql.
17/08/31 01:34:13 WARN manager.MySQLManager: This transfer can be faster! Use the --direct
17/08/31 01:34:13 WARN manager.MySQLManager: option to exercise a MySQL-specific fast path.
17/08/31 01:34:13 INFO manager.MySQLManager: Setting zero DATETIME behavior to convertToNull (mysql)
17/08/31 01:34:13 DEBUG manager.MySQLManager: Rewriting connect string to jdbc:mysql://10.0.2.15:3306/rkdb?zeroDateTimeBehavior=convertToNull
17/08/31 01:34:13 INFO mapreduce.ImportJobBase: Beginning import of employee
17/08/31 01:34:13 DEBUG util.ClassLoaderStack: Checking for existing class: employee
17/08/31 01:34:13 DEBUG util.ClassLoaderStack: Attempting to load jar through URL: jar:file:/tmp/sqoop-cloudera/compile/5b58dd4e681df3737c3f8ce4f32013ac/employee.jar!/
17/08/31 01:34:13 DEBUG util.ClassLoaderStack: Previous classloader is sun.misc.Launcher$AppClassLoader@7d487b8b
17/08/31 01:34:13 DEBUG util.ClassLoaderStack: Testing class in jar: employee
17/08/31 01:34:13 DEBUG util.ClassLoaderStack: Loaded jar into current JVM: jar:file:/tmp/sqoop-cloudera/compile/5b58dd4e681df3737c3f8ce4f32013ac/employee.jar!/
17/08/31 01:34:13 DEBUG util.ClassLoaderStack: Added classloader for jar /tmp/sqoop-cloudera/compile/5b58dd4e681df3737c3f8ce4f32013ac/employee.jar: java.net.FactoryURLClassLoader@2ea3741
17/08/31 01:34:16 INFO Configuration.deprecation: mapred.jar is deprecated. Instead, use mapreduce.job.jar
17/08/31 01:34:16 DEBUG db.DBConfiguration: Securing password into job credentials store
17/08/31 01:34:16 DEBUG mapreduce.DataDrivenImportJob: Using table class: employee
17/08/31 01:34:16 DEBUG mapreduce.DataDrivenImportJob: Using InputFormat: class com.cloudera.sqoop.mapreduce.db.DataDrivenDBInputFormat
17/08/31 01:34:19 INFO Configuration.deprecation: mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps
17/08/31 01:34:19 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/lib/sqoop/sqoop-1.4.6-cdh5.12.0.jar
17/08/31 01:34:19 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/share/java/mysql-connector-java-5.1.34-bin.jar
17/08/31 01:34:19 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/lib/sqoop/sqoop-1.4.6-cdh5.12.0.jar
17/08/31 01:34:19 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/lib/sqoop/sqoop-1.4.6-cdh5.12.0.jar
17/08/31 01:34:19 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/lib/sqoop/lib/avro.jar
17/08/31 01:34:19 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/lib/sqoop/lib/commons-lang3-3.4.jar
17/08/31 01:34:19 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/lib/sqoop/lib/jackson-mapper-asl-1.8.8.jar
17/08/31 01:34:19 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/lib/sqoop/lib/ant-contrib-1.0b3.jar
17/08/31 01:34:19 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/lib/sqoop/lib/parquet-hadoop.jar
17/08/31 01:34:19 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/lib/sqoop/lib/kite-hadoop-compatibility.jar
17/08/31 01:34:19 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/lib/sqoop/lib/commons-logging-1.1.3.jar
17/08/31 01:34:19 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/lib/sqoop/lib/slf4j-api-1.7.5.jar
17/08/31 01:34:19 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/lib/sqoop/lib/opencsv-2.3.jar
17/08/31 01:34:19 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/lib/sqoop/lib/ant-eclipse-1.0-jvm1.2.jar
17/08/31 01:34:19 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/lib/sqoop/lib/xz-1.0.jar
17/08/31 01:34:19 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/lib/sqoop/lib/commons-jexl-2.1.1.jar
17/08/31 01:34:19 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/lib/sqoop/lib/parquet-format.jar
17/08/31 01:34:19 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/lib/sqoop/lib/commons-compress-1.4.1.jar
17/08/31 01:34:19 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/lib/sqoop/lib/parquet-avro.jar
17/08/31 01:34:19 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/lib/sqoop/lib/commons-codec-1.4.jar
17/08/31 01:34:19 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/lib/sqoop/lib/parquet-encoding.jar
17/08/31 01:34:19 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/lib/sqoop/lib/parquet-jackson.jar
17/08/31 01:34:19 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/lib/sqoop/lib/kite-data-core.jar
17/08/31 01:34:19 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/lib/sqoop/lib/jackson-core-2.3.1.jar
17/08/31 01:34:19 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/lib/sqoop/lib/kite-data-mapreduce.jar
17/08/31 01:34:19 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/lib/sqoop/lib/parquet-column.jar
17/08/31 01:34:19 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/lib/sqoop/lib/avro-mapred-hadoop2.jar
17/08/31 01:34:19 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/lib/sqoop/lib/jackson-databind-2.3.1.jar
17/08/31 01:34:19 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/lib/sqoop/lib/paranamer-2.3.jar
17/08/31 01:34:19 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/lib/sqoop/lib/snappy-java-1.0.4.1.jar
17/08/31 01:34:19 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/lib/sqoop/lib/hsqldb-1.8.0.10.jar
17/08/31 01:34:19 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/lib/sqoop/lib/kite-data-hive.jar
17/08/31 01:34:19 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/lib/sqoop/lib/commons-io-1.4.jar
17/08/31 01:34:19 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/lib/sqoop/lib/jackson-annotations-2.3.1.jar
17/08/31 01:34:19 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/lib/sqoop/lib/jackson-core-asl-1.8.8.jar
17/08/31 01:34:19 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/lib/sqoop/lib/parquet-common.jar
17/08/31 01:34:19 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/lib/sqoop/lib/fastutil-6.3.jar
17/08/31 01:34:20 INFO client.RMProxy: Connecting to ResourceManager at quickstart.cloudera/10.0.2.15:8032
17/08/31 01:34:23 WARN hdfs.DFSClient: Caught exception 
java.lang.InterruptedException
    at java.lang.Object.wait(Native Method)
    at java.lang.Thread.join(Thread.java:1281)
    at java.lang.Thread.join(Thread.java:1355)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.closeResponder(DFSOutputStream.java:952)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.endBlock(DFSOutputStream.java:690)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:879)
17/08/31 01:34:25 WARN hdfs.DFSClient: Caught exception 
java.lang.InterruptedException
    at java.lang.Object.wait(Native Method)
    at java.lang.Thread.join(Thread.java:1281)
    at java.lang.Thread.join(Thread.java:1355)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.closeResponder(DFSOutputStream.java:952)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.endBlock(DFSOutputStream.java:690)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:879)
17/08/31 01:34:26 DEBUG db.DBConfiguration: Fetching password from job credentials store
17/08/31 01:34:26 INFO db.DBInputFormat: Using read commited transaction isolation
17/08/31 01:34:26 INFO db.DataDrivenDBInputFormat: BoundingValsQuery: SELECT MIN(`id`), MAX(`id`) FROM `employee`
17/08/31 01:34:26 INFO db.IntegerSplitter: Split size: 125; Num splits: 4 from: 100 to: 600
17/08/31 01:34:26 DEBUG db.IntegerSplitter: Splits: [                         100 to                          600] into 4 parts
17/08/31 01:34:26 DEBUG db.IntegerSplitter:                          100
17/08/31 01:34:26 DEBUG db.IntegerSplitter:                          225
17/08/31 01:34:26 DEBUG db.IntegerSplitter:                          350
17/08/31 01:34:26 DEBUG db.IntegerSplitter:                          475
17/08/31 01:34:26 DEBUG db.IntegerSplitter:                          600
17/08/31 01:34:26 DEBUG db.DataDrivenDBInputFormat: Creating input split with lower bound '`id` >= 100' and upper bound '`id` < 225'
17/08/31 01:34:26 DEBUG db.DataDrivenDBInputFormat: Creating input split with lower bound '`id` >= 225' and upper bound '`id` < 350'
17/08/31 01:34:26 DEBUG db.DataDrivenDBInputFormat: Creating input split with lower bound '`id` >= 350' and upper bound '`id` < 475'
17/08/31 01:34:26 DEBUG db.DataDrivenDBInputFormat: Creating input split with lower bound '`id` >= 475' and upper bound '`id` <= 600'
17/08/31 01:34:26 WARN hdfs.DFSClient: Caught exception 
java.lang.InterruptedException
    at java.lang.Object.wait(Native Method)
    at java.lang.Thread.join(Thread.java:1281)
    at java.lang.Thread.join(Thread.java:1355)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.closeResponder(DFSOutputStream.java:952)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.endBlock(DFSOutputStream.java:690)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:879)
17/08/31 01:34:26 INFO mapreduce.JobSubmitter: number of splits:4
17/08/31 01:34:27 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1504153900489_0005
17/08/31 01:34:29 INFO impl.YarnClientImpl: Submitted application application_1504153900489_0005
17/08/31 01:34:29 INFO mapreduce.Job: The url to track the job: http://quickstart.cloudera:8088/proxy/application_1504153900489_0005/
17/08/31 01:34:29 INFO mapreduce.Job: Running job: job_1504153900489_0005
17/08/31 01:34:52 INFO mapreduce.Job: Job job_1504153900489_0005 running in uber mode : false
17/08/31 01:34:52 INFO mapreduce.Job:  map 0% reduce 0%
17/08/31 01:37:25 INFO mapreduce.Job:  map 50% reduce 0%
17/08/31 01:38:48 INFO mapreduce.Job:  map 100% reduce 0%
17/08/31 01:38:50 INFO mapreduce.Job: Job job_1504153900489_0005 completed successfully
17/08/31 01:38:51 INFO mapreduce.Job: Counters: 30
    File System Counters
        FILE: Number of bytes read=0
        FILE: Number of bytes written=609708
        FILE: Number of read operations=0
        FILE: Number of large read operations=0
        FILE: Number of write operations=0
        HDFS: Number of bytes read=409
        HDFS: Number of bytes written=158
        HDFS: Number of read operations=16
        HDFS: Number of large read operations=0
        HDFS: Number of write operations=8
    Job Counters 
        Launched map tasks=4
        Other local map tasks=4
        Total time spent by all maps in occupied slots (ms)=234310144
        Total time spent by all reduces in occupied slots (ms)=0
        Total time spent by all map tasks (ms)=457637
        Total vcore-milliseconds taken by all map tasks=457637
        Total megabyte-milliseconds taken by all map tasks=234310144
    Map-Reduce Framework
        Map input records=6
        Map output records=6
        Input split bytes=409
        Spilled Records=0
        Failed Shuffles=0
        Merged Map outputs=0
        GC time elapsed (ms)=8848
        CPU time spent (ms)=7840
        Physical memory (bytes) snapshot=445177856
        Virtual memory (bytes) snapshot=2910158848
        Total committed heap usage (bytes)=188219392
    File Input Format Counters 
        Bytes Read=0
    File Output Format Counters 
        Bytes Written=158
17/08/31 01:38:51 INFO mapreduce.ImportJobBase: Transferred 158 bytes in 271.6628 seconds (0.5816 bytes/sec)
17/08/31 01:38:51 INFO mapreduce.ImportJobBase: Retrieved 6 records.
17/08/31 01:38:51 DEBUG util.ClassLoaderStack: Restoring classloader: sun.misc.Launcher$AppClassLoader@7d487b8b
17/08/31 01:38:51 DEBUG hive.HiveImport: Hive.inputTable: employee
17/08/31 01:38:51 DEBUG hive.HiveImport: Hive.outputTable: rkdb_hive.employee11
17/08/31 01:38:51 DEBUG manager.SqlManager: Execute getColumnInfoRawQuery : SELECT t.* FROM `employee` AS t LIMIT 1
17/08/31 01:38:51 DEBUG manager.SqlManager: No connection paramenters specified. Using regular API for making connection.
17/08/31 01:38:52 DEBUG manager.SqlManager: Using fetchSize for next query: -2147483648
17/08/31 01:38:52 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `employee` AS t LIMIT 1
17/08/31 01:38:52 DEBUG manager.SqlManager: Found column id of type [4, 11, 0]
17/08/31 01:38:52 DEBUG manager.SqlManager: Found column name of type [12, 20, 0]
17/08/31 01:38:52 DEBUG manager.SqlManager: Found column dept of type [12, 10, 0]
17/08/31 01:38:52 DEBUG manager.SqlManager: Found column salary of type [4, 10, 0]
17/08/31 01:38:52 DEBUG hive.TableDefWriter: Create statement: CREATE TABLE `rkdb_hive.employee11` ( `id` INT, `name` STRING, `dept` STRING, `salary` INT) COMMENT 'Imported by sqoop on 2017/08/31 01:38:52' ROW FORMAT DELIMITED FIELDS TERMINATED BY '\054' LINES TERMINATED BY '\012' STORED AS TEXTFILE
17/08/31 01:38:52 DEBUG hive.TableDefWriter: Load statement: LOAD DATA INPATH 'hdfs://quickstart.cloudera:8020/user/cloudera/ImportSqoop12' INTO TABLE `rkdb_hive.employee11`
17/08/31 01:38:52 INFO hive.HiveImport: Loading uploaded data into Hive
17/08/31 01:38:52 DEBUG hive.HiveImport: Using in-process Hive instance.
17/08/31 01:38:52 DEBUG util.SubprocessSecurityManager: Installing subprocess security manager

Logging initialized using configuration in jar:file:/usr/lib/hive/lib/hive-common-1.1.0-cdh5.12.0.jar!/hive-log4j.properties
Comment (2 upvotes, by leftjoin): Try splitting your data using the --split-by option. See suggestions about choosing a split column here: stackoverflow.com/a/37389134/2700344. An example command is sketched below.
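A minimal sketch of the flag in use, reusing the masked connection string and table from the question and assuming id is a suitable numeric key:

    sqoop import --connect jdbc:mysql://10.X.X.XX:XXXX/rkdb --username root -P \
      --table employee --split-by id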

2 Answers

1 vote

Please include import and --hive-import in your Sqoop command. The create-hive-table tool only creates the table definition in Hive; it does not transfer any data. A sketch of the corrected command is below.
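For example, reusing the connection details from the question (a sketch, not verified against your environment):

    sqoop import \
      --connect jdbc:mysql://10.X.X.XX:XXXX/rkdb \
      --username root -P \
      --table employee \
      --hive-import --hive-table emps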

0 votes

Your Sqoop MapReduce job completed successfully; it is the Hive load step that is stuck, and that step depends on ZooKeeper. Just turn on the ZooKeeper service manually from Cloudera Manager. A shell alternative is sketched below.
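If you would rather do this from the shell, on the Cloudera QuickStart VM something like the following should work, assuming the packaged zookeeper-server init script that ships with CDH:

    # check whether ZooKeeper is running, then start it if needed
    sudo service zookeeper-server status
    sudo service zookeeper-server start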