
I am using the following command to fetch data from a MySQL table into a Hive table:

sqoop import \
--connect jdbc:mysql://xx.xx.xx.xx/orderdbms \
--username=orderuser \
--password=orderpass \
--table=order \
--where="DATE(created)='2015-08-20'" \
--hive-import \
--hive-table=orderstat.order \
--target-dir=/user/ordermanager/sqoopdata/orders \
--direct

I am getting the following error when I run the above:

Error reading from database: java.sql.SQLException: Streaming result set com.mysql.jdbc.RowDataDynamic@1f16ebd3 is still active. No statements may be issued when any streaming result sets are open and in use on a given connection. Ensure that you have called .close() on any active streaming result sets before attempting more queries.
java.sql.SQLException: Streaming result set com.mysql.jdbc.RowDataDynamic@1f16ebd3 is still active. No statements may be issued when any streaming result sets are open and in use on a given connection. Ensure that you have called .close() on any active streaming result sets before attempting more queries.
    at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:914)
    at com.mysql.jdbc.MysqlIO.checkForOutstandingStreamingData(MysqlIO.java:2181)
    at com.mysql.jdbc.MysqlIO.sendCommand(MysqlIO.java:1542)
    at com.mysql.jdbc.MysqlIO.sqlQueryDirect(MysqlIO.java:1723)
    at com.mysql.jdbc.Connection.execSQL(Connection.java:3277)
    at com.mysql.jdbc.Connection.execSQL(Connection.java:3206)
    at com.mysql.jdbc.Statement.executeQuery(Statement.java:1232)
    at com.mysql.jdbc.Connection.getMaxBytesPerChar(Connection.java:3673)
    at com.mysql.jdbc.Field.getMaxBytesPerCharacter(Field.java:482)
    at com.mysql.jdbc.ResultSetMetaData.getPrecision(ResultSetMetaData.java:443)
    at org.apache.sqoop.manager.SqlManager.getColumnInfoForRawQuery(SqlManager.java:286)
    at org.apache.sqoop.manager.SqlManager.getColumnTypesForRawQuery(SqlManager.java:241)
    at org.apache.sqoop.manager.SqlManager.getColumnTypes(SqlManager.java:227)
    at org.apache.sqoop.manager.ConnManager.getColumnTypes(ConnManager.java:295)
    at org.apache.sqoop.orm.ClassWriter.getColumnTypes(ClassWriter.java:1833)
    at org.apache.sqoop.orm.ClassWriter.generate(ClassWriter.java:1645)
    at org.apache.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:107)
    at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:478)
    at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:605)
    at org.apache.sqoop.Sqoop.run(Sqoop.java:143)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
    at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:179)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:218)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:227)
    at org.apache.sqoop.Sqoop.main(Sqoop.java:236)
15/08/24 11:54:46 ERROR tool.ImportTool: Encountered IOException running import job: java.io.IOException: No columns to generate for ClassWriter
    at org.apache.sqoop.orm.ClassWriter.generate(ClassWriter.java:1651)
    at org.apache.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:107)
    at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:478)
    at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:605)
    at org.apache.sqoop.Sqoop.run(Sqoop.java:143)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
    at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:179)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:218)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:227)
    at org.apache.sqoop.Sqoop.main(Sqoop.java:236)

Can anyone please explain what is causing this? Is it a problem with the syntax, or a connectivity issue between the client and the MySQL server?

I am not familiar with Hive and Sqoop, but reading the error stack, I think the Sqoop import has not completed before the Hive import is issued, so you may need to close the connection and start the second import explicitly. – Ravinder Reddy
Can you try replacing the MySQL JAR in the Sqoop lib directory with the latest version? – Chennakrishna
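
If you want to try the JAR swap suggested in that comment, it is roughly the following (a sketch only; the SQOOP_HOME path and connector version are placeholders for whatever your installation uses):

# Move the old Connector/J out of Sqoop's classpath, then drop in a newer one.
# Paths and version number are hypothetical; adjust to your installation.
mv $SQOOP_HOME/lib/mysql-connector-java-*.jar /tmp/
cp mysql-connector-java-5.1.36-bin.jar $SQOOP_HOME/lib/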

1 Answer


Try adding the option

--driver com.mysql.jdbc.Driver

Explicitly setting --driver makes Sqoop fall back to its generic JDBC connection manager instead of the MySQL-specific one. The generic manager does not open MySQL streaming result sets, which should avoid the "streaming result set ... is still active" conflict shown in your stack trace.
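
For reference, here is a sketch of the command from your question with that option added (untested; I have also dropped --direct here, since direct mysqldump-based transfer goes through the MySQL-specific manager and may not apply once Sqoop falls back to the generic one):

sqoop import \
--connect jdbc:mysql://xx.xx.xx.xx/orderdbms \
--driver com.mysql.jdbc.Driver \
--username=orderuser \
--password=orderpass \
--table=order \
--where="DATE(created)='2015-08-20'" \
--hive-import \
--hive-table=orderstat.order \
--target-dir=/user/ordermanager/sqoopdata/orders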