Sqoop: java.lang.Double cannot be cast to java.nio.ByteBuffer

polhcujo  posted 2021-07-13 in Hadoop

I am trying to import a table from Oracle into Hive and I keep getting this error.
I am running:

sqoop import -Dmapreduce.job.queuename= \
    --connect jdbc:oracle::@// \
    --username x \
    --password-file=x \
    --query "select description, zona from base.test" \
    --mapreduce-job-name jobsqoop-test \
    --target-dir /data/user/hive/warehouse/base.db/test \
    --split-by zona \
    --map-column-java "zona=Double,description=String" \
    --delete-target-dir \
    --as-parquetfile \
    --compression-codec=snappy \
    --null-string '\n' \
    --null-non-string '\n' \
    --num-mappers 1 \
    --hive-import \
    --hive-overwrite \
    --hive-database \
    --hive-table test \
    --direct

Error: java.lang.ClassCastException: java.lang.Double cannot be cast to java.nio.ByteBuffer
        at org.apache.parquet.avro.AvroWriteSupport.writeValueWithoutConversion(AvroWriteSupport.java:338)
        at org.apache.parquet.avro.AvroWriteSupport.writeValue(AvroWriteSupport.java:271)
        at org.apache.parquet.avro.AvroWriteSupport.writeRecordFields(AvroWriteSupport.java:187)
        at org.apache.parquet.avro.AvroWriteSupport.write(AvroWriteSupport.java:161)
        at org.apache.parquet.hadoop.InternalParquetRecordWriter.write(InternalParquetRecordWriter.java:123)
        at org.apache.parquet.hadoop.ParquetRecordWriter.write(ParquetRecordWriter.java:179)
        at org.apache.parquet.hadoop.ParquetRecordWriter.write(ParquetRecordWriter.java:46)
        at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:670)
        at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89)
        at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:112)
        at org.apache.sqoop.mapreduce.parquet.hadoop.HadoopParquetImportMapper.write(HadoopParquetImportMapper.java:61)
        at org.apache.sqoop.mapreduce.ParquetImportMapper.map(ParquetImportMapper.java:72)
        at org.apache.sqoop.mapreduce.ParquetImportMapper.map(ParquetImportMapper.java:38)
        at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:146)
        at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:799)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:347)
        at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:174)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1875)
        at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:168)

Any ideas?
Thanks

q7solyqu


It works with

-Dsqoop.parquet.logical_types.decimal.enable=false

With Parquet decimal logical types enabled, Sqoop writes the Oracle NUMBER column as a byte-backed decimal, which clashes with the Double value produced by --map-column-java; disabling the logical type avoids the ClassCastException.
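A minimal sketch of where the property goes, assuming the same command as in the question. <queue>, <host>, <port>, <service>, and <db> are placeholders for the values redacted above, jdbc:oracle:thin is an assumed driver prefix, and the WHERE $CONDITIONS token that Sqoop requires with --query is spelled out; generic -D options must come right after "import", before the tool-specific flags:

sqoop import \
    -Dsqoop.parquet.logical_types.decimal.enable=false \
    -Dmapreduce.job.queuename=<queue> \
    --connect jdbc:oracle:thin:@//<host>:<port>/<service> \
    --username x \
    --password-file=x \
    --query "select description, zona from base.test where \$CONDITIONS" \
    --mapreduce-job-name jobsqoop-test \
    --target-dir /data/user/hive/warehouse/base.db/test \
    --split-by zona \
    --map-column-java "zona=Double,description=String" \
    --delete-target-dir \
    --as-parquetfile \
    --compression-codec=snappy \
    --null-string '\n' \
    --null-non-string '\n' \
    --num-mappers 1 \
    --hive-import \
    --hive-overwrite \
    --hive-database <db> \
    --hive-table test \
    --direct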
