A sqoop export to Oracle fails with ORA-01438: value larger than specified precision allowed for this column. We tried splitting the data and exporting the splits to isolate the failing rows, but rows that fail in a larger export succeed when exported on their own, and each time we set one "failing" row aside, the failure simply moves to a different row. The full data set has 1,689,105 rows. After adding a row id we found that rows up to 172200 export successfully, and exporting rows 172200 and 172201 together also succeeds. So we split the Hive table at row 172200: the export of rows up to 172200 then failed at row 172177, while the export of rows above 172200 failed at row 452729. Inspecting the reported rows individually, we could not find any bad data in them either.
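Since ORA-01438 means a value needs more digits before the decimal point than the target NUMBER(p,s) column allows (more than p - s integer digits), one way to hunt for a culprit is to scan the exported part file directly instead of eyeballing rows. A minimal sketch, with several assumptions: precision 7 / scale 2 stand in for the real column definition (take it from the Oracle table's DESC output), and the one-row sample file `sample_0` is hypothetical — against real data you would read the actual 000000_0 part file. The value 2343750.0 in the sample is taken from the "On input" row in the log below.

```shell
# Hedged sketch: flag fields whose integer-digit count would overflow an
# Oracle NUMBER(p,s) column (ORA-01438 fires when a value needs more than
# p - s digits before the decimal point).
# prec=7 / scale=2 and the one-row sample file are assumptions; substitute
# the real column definition and the actual 000000_0 part file.
printf 'ACME Ltd\0012343750.0\00110.0\001\\N\n' > sample_0

awk -F '\001' -v prec=7 -v scale=2 '
{
  for (i = 1; i <= NF; i++) {
    v = $i
    sub(/^-/, "", v)                    # ignore a leading sign
    if (v !~ /^[0-9]+(\.[0-9]+)?$/)     # skip non-numeric fields (\N, text, dates)
      continue
    sub(/\..*$/, "", v)                 # keep the integer part only
    sub(/^0+/, "", v)                   # leading zeros do not count
    if (length(v) > prec - scale)
      printf "row %d, field %d: %s overflows NUMBER(%d,%d)\n", NR, i, $i, prec, scale
  }
}' sample_0
```

For the sample row this flags field 2, since 2343750 has seven integer digits but a NUMBER(7,2) column only holds five. Widen the numeric pattern if the columns can legitimately carry scientific notation.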
The error log is below:
2016-12-07 06:53:14,691 ERROR [main] org.apache.sqoop.mapreduce.TextExportMapper:
2016-12-07 06:53:14,691 ERROR [main] org.apache.sqoop.mapreduce.TextExportMapper: Exception raised during data export
2016-12-07 06:53:14,691 ERROR [main] org.apache.sqoop.mapreduce.TextExportMapper:
2016-12-07 06:53:14,691 ERROR [main] org.apache.sqoop.mapreduce.TextExportMapper: Exception:
java.io.IOException: java.sql.SQLDataException: ORA-01438: value larger than specified precision allowed for this column
at org.apache.sqoop.mapreduce.AsyncSqlRecordWriter.write(AsyncSqlRecordWriter.java:233)
at org.apache.sqoop.mapreduce.AsyncSqlRecordWriter.write(AsyncSqlRecordWriter.java:46)
at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:658)
at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89)
at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:112)
at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:84)
at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:39)
at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145)
at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:787)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:164)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1707)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
Caused by: java.sql.SQLDataException: ORA-01438: value larger than specified precision allowed for this column
at oracle.jdbc.driver.T4CTTIoer.processError(T4CTTIoer.java:450)
at oracle.jdbc.driver.T4CTTIoer.processError(T4CTTIoer.java:399)
at oracle.jdbc.driver.T4C8Oall.processError(T4C8Oall.java:1017)
at oracle.jdbc.driver.T4CTTIfun.receive(T4CTTIfun.java:655)
at oracle.jdbc.driver.T4CTTIfun.doRPC(T4CTTIfun.java:249)
at oracle.jdbc.driver.T4C8Oall.doOALL(T4C8Oall.java:566)
at oracle.jdbc.driver.T4CPreparedStatement.doOall8(T4CPreparedStatement.java:215)
at oracle.jdbc.driver.T4CPreparedStatement.doOall8(T4CPreparedStatement.java:58)
at oracle.jdbc.driver.T4CPreparedStatement.executeForRows(T4CPreparedStatement.java:943)
at oracle.jdbc.driver.OraclePreparedStatement.executeForRowsWithTimeout(OraclePreparedStatement.java:10932)
at oracle.jdbc.driver.OraclePreparedStatement.executeBatch(OraclePreparedStatement.java:11043)
at oracle.jdbc.driver.OracleStatementWrapper.executeBatch(OracleStatementWrapper.java:244)
at org.apache.sqoop.mapreduce.AsyncSqlOutputFormat$AsyncSqlExecThread.run(AsyncSqlOutputFormat.java:231)
2016-12-07 06:53:14,692 ERROR [main] org.apache.sqoop.mapreduce.TextExportMapper: On input: 172177JACOB IZBICKI & CO. LTD101726825708\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N10.0\N\NNo PremiumNo Premium\N0.00.00.00.00.00.00.00.00.00.00.00.00.00.00.00.00.00.00.00.00.00.00.02343750.00.00.00.00.00.00.00.00.00.00.00.00.00.00.00.00.00.00.00.00.00.00.00.00.00.00.00.00.00.00.00.00.00.00.00.00.00.00.00.00.00.00.00.00.00.00.00.00.00.00.00.00.00.00.00.00.00.00.0\N\N0.00.00.00.00.00.00.01905.00.0No Data2005-05-25 00:00:00.02007-05-26 00:00:00.0\N2.02.0Low0.0Low1.32Low0.0LowNONEIncrease price to drive to profitabilityNULLNULLNULLNULLNULLNULLNULLNULLNULLNULLNULLNULL\N\N\N\N\N\N\N\N\N\N\N\N\N\N0.00.00.00.00.00.00.00.00.00.00.00.00.00.0
2016-12-07 06:53:14,692 ERROR [main] org.apache.sqoop.mapreduce.TextExportMapper: On input file: hdfs://dnvdevbigdata2.corp.nai.org:8020/user/hive/warehouse/mcafee_masterdatabase.db/wc_output_clv_detail_mdmparent1/000000_0
2016-12-07 06:53:14,692 ERROR [main] org.apache.sqoop.mapreduce.TextExportMapper: At position 82909184
2016-12-07 06:53:14,692 ERROR [main] org.apache.sqoop.mapreduce.TextExportMapper:
2016-12-07 06:53:14,692 ERROR [main] org.apache.sqoop.mapreduce.TextExportMapper: Currently processing split:
2016-12-07 06:53:14,692 ERROR [main] org.apache.sqoop.mapreduce.TextExportMapper: Paths:/user/hive/warehouse/XXXXXX.db/XXXXXXX/000000_0:0+67108864,/user/hive/warehouse/mcafee_masterdatabase.db/wc_output_clv_detail_mdmparent1/000000_0:67108864+15820814,/user/hive/warehouse/XXXXXX.db/XXXXXX/000002_0:0+67108736
2016-12-07 06:53:14,692 ERROR [main] org.apache.sqoop.mapreduce.TextExportMapper:
2016-12-07 06:53:14,692 ERROR [main] org.apache.sqoop.mapreduce.TextExportMapper: This issue might not necessarily be caused by current input
2016-12-07 06:53:14,692 ERROR [main] org.apache.sqoop.mapreduce.TextExportMapper: due to the batching nature of export.
2016-12-07 06:53:14,692 ERROR [main] org.apache.sqoop.mapreduce.TextExportMapper:
2016-12-07 06:53:14,692 INFO [Thread-11] org.apache.sqoop.mapreduce.AutoProgressMapper: Auto-progress thread is finished. keepGoing=false
The sqoop command used:
sqoop export --connect jdbc:oracle:thin:@odevbi-XXXXXXX:1521/XXXXX \
--username YYYY --password YYYY123 \
--table TEST_TABLE \
--input-fields-terminated-by '\001' \
--lines-terminated-by '\n' \
-m 1 \
--input-null-non-string "\\\N" \
--input-null-string "\\\N" \
--direct \
--export-dir /user/hive/warehouse/yyyyy.db/xxxxx
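The log itself warns that "this issue might not necessarily be caused by current input due to the batching nature of export" — the stack trace ends in OraclePreparedStatement.executeBatch, so Oracle rejects a whole batch and Sqoop can only report the last row it handed over. That would explain why the reported rows export cleanly on their own and why the failure "moves" as the data is split. One way to make the reported row trustworthy (a sketch, not a verified fix: it assumes the generic JDBC export path, so --direct is dropped, and it keeps the placeholders from the original command) is to force one record per INSERT and one statement per transaction via Sqoop's documented export properties:

```shell
# Sketch: re-run the export with batching disabled so the row Sqoop reports
# on failure is the row Oracle actually rejected. --direct is removed because
# the Oracle direct path does its own batching; this will be much slower and
# is meant only for pinpointing the bad row. Placeholders are unchanged.
sqoop export \
  -Dsqoop.export.records.per.statement=1 \
  -Dsqoop.export.statements.per.transaction=1 \
  --connect jdbc:oracle:thin:@odevbi-XXXXXXX:1521/XXXXX \
  --username YYYY --password YYYY123 \
  --table TEST_TABLE \
  --input-fields-terminated-by '\001' \
  --lines-terminated-by '\n' \
  -m 1 \
  --input-null-non-string '\\N' \
  --input-null-string '\\N' \
  --export-dir /user/hive/warehouse/yyyyy.db/xxxxx
```

Once the genuinely failing row is known, comparing its numeric fields against the target columns' NUMBER(p,s) definitions should show which value exceeds the allowed precision.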