We are using the Hortonworks distribution with Sqoop 1.4.6. I am trying to load an Oracle table from Avro files using `sqoop export`, but it fails with the following stack trace:
Error: java.io.IOException: Can't export data, please check failed map task logs
at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:112)
at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:39)
at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:146)
at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:787)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
Caused by: java.lang.RuntimeException: Can't parse input data: 'Objavro.schema��{"type":"record"'
at HADOOP_CLAIMS.__loadFromFields(HADOOP_CLAIMS.java:208)
at HADOOP_CLAIMS.parse(HADOOP_CLAIMS.java:156)
at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:83)
... 10 more
Caused by: java.lang.NumberFormatException
at java.math.BigDecimal.<init>(BigDecimal.java:470)
at java.math.BigDecimal.<init>(BigDecimal.java:739)
at HADOOP_CLAIMS.__loadFromFields(HADOOP_CLAIMS.java:205)
It looks like Sqoop is using TextExportMapper instead of an Avro-aware export mapper: the "Can't parse input data: 'Objavro.schema'" message shows it is reading the raw Avro container-file header (which begins with the bytes "Obj" followed by the embedded schema) as if it were delimited text. I found this issue reported against version 1.4.5: https://issues.apache.org/jira/browse/SQOOP-1283
Any idea why this still happens in Sqoop 1.4.6? Is there another component that also needs to be patched?
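For reference, the export is invoked roughly as follows. This is a minimal sketch; the JDBC connection string, credentials, table name, and HDFS path are placeholders, not the actual values:

```shell
# Hypothetical invocation that reproduces the failure above.
# Connection details and the export directory are placeholders.
sqoop export \
  --connect jdbc:oracle:thin:@//dbhost:1521/ORCL \
  --username dbuser -P \
  --table HADOOP_CLAIMS \
  --export-dir /data/claims/avro
```

With this form of the command, the mapper chosen for the export job is what ends up parsing the files under `--export-dir`, which is where the Avro header gets misread as text.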