0
votes

I am trying to Sqoop a table's data from Hive to Teradata and I am getting the following error:

Error: com.teradata.connector.common.exception.ConnectorException: java.sql.SQLException: [Teradata JDBC Driver] [TeraJDBC 15.00.00.20] [Error 1186] [SQLState HY000] Parameter 8 length is 67618 bytes, which is greater than the maximum 64000 bytes that can be set.

Can anyone please suggest exactly what change I need to make here? Column 8 is a very long string in the Hive table, which is why I defined the data type in Teradata as VARCHAR(50000), but it is still failing.

Error: com.teradata.connector.common.exception.ConnectorException: java.sql.SQLException: [Teradata JDBC Driver] [TeraJDBC 15.00.00.20] [Error 1186] [SQLState HY000] Parameter 8 length is 67618 bytes, which is greater than the maximum 64000 bytes that can be set.
    at com.teradata.jdbc.jdbc_4.util.ErrorFactory.makeDriverJDBCException(ErrorFactory.java:94)
    at com.teradata.jdbc.jdbc_4.util.ErrorFactory.makeDriverJDBCException(ErrorFactory.java:74)
    at com.teradata.jdbc.jdbc_4.TDPreparedStatement.internalSetString(TDPreparedStatement.java:1121)
    at com.teradata.jdbc.jdbc_4.TDPreparedStatement.setString(TDPreparedStatement.java:1095)
    at com.teradata.jdbc.jdbc_4.TDPreparedStatement.setObject(TDPreparedStatement.java:1631)
    at com.teradata.connector.teradata.TeradataObjectArrayWritable.write(TeradataObjectArrayWritable.java:232)
    at com.teradata.connector.teradata.TeradataBatchInsertOutputFormat$TeradataRecordWriter.write(TeradataBatchInsertOutputFormat.java:142)
    at com.teradata.connector.teradata.TeradataBatchInsertOutputFormat$TeradataRecordWriter.write(TeradataBatchInsertOutputFormat.java:114)
    at com.teradata.connector.common.ConnectorOutputFormat$ConnectorFileRecordWriter.write(ConnectorOutputFormat.java:107)
    at com.teradata.connector.common.ConnectorOutputFormat$ConnectorFileRecordWriter.write(ConnectorOutputFormat.java:65)
    at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:658)
    at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89)
    at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:112)
    at com.teradata.connector.common.ConnectorMMapper.map(ConnectorMMapper.java:129)
    at com.teradata.connector.common.ConnectorMMapper.run(ConnectorMMapper.java:117)
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:787)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
    at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1671)
    at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)

    at com.teradata.connector.teradata.TeradataBatchInsertOutputFormat$TeradataRecordWriter.write(TeradataBatchInsertOutputFormat.java:151)
    at com.teradata.connector.teradata.TeradataBatchInsertOutputFormat$TeradataRecordWriter.write(TeradataBatchInsertOutputFormat.java:114)
    at com.teradata.connector.common.ConnectorOutputFormat$ConnectorFileRecordWriter.write(ConnectorOutputFormat.java:107)
    at com.teradata.connector.common.ConnectorOutputFormat$ConnectorFileRecordWriter.write(ConnectorOutputFormat.java:65)
    at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:658)
    at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89)
    at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:112)
    at com.teradata.connector.common.ConnectorMMapper.map(ConnectorMMapper.java:129)
    at com.teradata.connector.common.ConnectorMMapper.run(ConnectorMMapper.java:117)
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:787)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
    at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1671)
    at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
67618 bytes is definitely more than VARCHAR(50000). Additionally, the data from Hive is probably Unicode, in which case the maximum size is 32000. - dnoeth
So what would be a possible data type in Teradata that resolves the issue? - Koushik Chandra
The max length for a VARCHAR column is 64000 bytes, and as @dnoeth points out, 67618 is greater than both 50000 and 64000, so VARCHAR isn't going to work. If you re-define the target column as CLOB(70000), you shouldn't still be receiving the error you've got in your post. - Andrew

1 Answer

0
votes

The string column in Hive holds 67618 bytes, but you are mapping it to VARCHAR(50000) in Teradata.

So the error is expected: the value exceeds both the declared length and Teradata's VARCHAR limit of 64000 bytes (32000 characters for Unicode data).

You should define the target column as CLOB(70000) instead.
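
As a minimal sketch of the DDL (the database, table, and other column names below are made up purely for illustration), the target table could look roughly like this:

    -- Hypothetical target table: the long Hive string maps to a CLOB column.
    -- Teradata VARCHAR tops out at 64000 bytes (LATIN) / 32000 characters (UNICODE),
    -- so a 67618-byte value only fits in a LOB type such as CLOB.
    CREATE TABLE mydb.target_table (
        id      INTEGER NOT NULL,
        col8    CLOB(70000),   -- previously VARCHAR(50000), which was too small
        load_ts TIMESTAMP(0)
    )
    PRIMARY INDEX (id);

Note that you may not be able to simply ALTER the existing VARCHAR column to CLOB; recreating the column (or the table) may be needed.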

The Sqoop export should then work.
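
For reference, here is a rough sketch of what the export invocation might look like; the host, credentials, database, and table names are placeholders, and the exact options depend on which Sqoop connector for Teradata you are running:

    # Hypothetical Sqoop export from a Hive table into the Teradata target table
    sqoop export \
      --connect jdbc:teradata://tdhost/DATABASE=mydb \
      --username my_user \
      --password '<password>' \
      --table TARGET_TABLE \
      --hcatalog-database default \
      --hcatalog-table hive_source_table \
      -m 4

Once the target column is a CLOB, the 67618-byte values should load without hitting the 64000-byte parameter limit.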