I'm trying to export data from Hive into an MS SQL Server. I know that Sqoop and the SQL Server connection are fine, as I can export another table without issue.

The error that I'm getting is:

14/06/19 14:48:37 INFO mapreduce.Job: Task Id : attempt_1403175168750_0031_m_000003_0, Status : FAILED
Error: java.io.IOException: Can't export data, please check failed map task logs
    at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:112)
    at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:39)
    at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145)
    at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:340)
    at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
    at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:163)
Caused by: java.lang.RuntimeException: Can't parse input data: 'You may be harboring secret illusions about how you want to li... More for Virgo http://t.co/Jnt91NMNt5'
    at StageFlumeTweets.__loadFromFields(StageFlumeTweets.java:236)
    at StageFlumeTweets.parse(StageFlumeTweets.java:174)
    at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:83)
    ... 10 more
Caused by: java.lang.NumberFormatException: For input string: "You may be harboring secret illusions about how you want to li... More for Virgo http://t.co/Jnt91NMNt5"
    at java.lang.NumberFormatException.forInputString(NumberFormatException.java:65)
    at java.lang.Long.parseLong(Long.java:441)
    at java.lang.Long.valueOf(Long.java:540)
    at StageFlumeTweets.__loadFromFields(StageFlumeTweets.java:228)
    ... 12 more

The command I'm using to export the data is:

sqoop export --verbose --connect "jdbc:sqlserver://xx;database=xx;username=xx;password=xx" --export-dir /user/hive/warehouse/xx/twitter_bulk2/ --table StageFlumeTweets --input-fields-terminated-by ','

The file with the data in it is just a tweet ID and the tweet text, like so:

468751929271517185,RT @BestofScorpio: A woman may want you, but she doesnt need you.
468751929565130752,I'm gonna need to borrow someone's red lipstick for this


1 Answer


You get a NumberFormatException.

It may be an overflow problem, given the size of the tweet IDs. Is the corresponding column in SQL Server declared as bigint?
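
For reference, a minimal sketch of a target table that would hold those values; only the table name StageFlumeTweets appears in the question, so the column names and lengths here are assumptions:

CREATE TABLE StageFlumeTweets (
    -- Tweet ids such as 468751929271517185 exceed the INT maximum
    -- (2,147,483,647), so this column needs to be BIGINT.
    tweetid   BIGINT NOT NULL,
    -- Tweet body; name and length are assumed for illustration.
    tweettext NVARCHAR(500) NULL
);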