avro: java.lang.RuntimeException: Unsupported type in record

Asked by h5qlskok on 2021-06-25 in Pig

Input: test.csv

100
101
102

Pig script:

-- REGISTER statements for the required jars go here

A = LOAD 'test.csv'  USING org.apache.pig.piggybank.storage.CSVExcelStorage() AS (code:chararray);

STORE A INTO 'test' USING org.apache.pig.piggybank.storage.avro.AvroStorage
    ('schema',
    '{"namespace":"com.pig.test.avro","type":"record","name":"Avro_Test","doc":"Avro Test Schema",
        "fields":[
            {"name":"code","type":["string","null"],"default":null}
            ]}'
    );

I get a runtime error at the STORE step. Any input on resolving this would be appreciated.
Error log:

ERROR org.apache.pig.tools.pigstats.SimplePigStats - ERROR 2997: Unable to recreate exception from backed error: org.apache.avro.file.DataFileWriter$AppendWriteException: java.lang.RuntimeException: Unsupported type in record:class java.lang.String
at org.apache.avro.file.DataFileWriter.append(DataFileWriter.java:263)
at org.apache.pig.piggybank.storage.avro.PigAvroRecordWriter.write(PigAvroRecordWriter.java:49)
at org.apache.pig.piggybank.storage.avro.AvroStorage.putNext(AvroStorage.java:722)
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigOutputFormat$PigRecordWriter.write(PigOutputFormat.java:139)
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigOutputFormat$PigRecordWriter.write(PigOutputFormat.java:98)
at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:558)
at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:85)
at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:106)
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigMapOnly$Map.collect(PigMap
2015-06-02 23:06:03,934 [main] ERROR org.apache.pig.tools.pigstats.PigStatsUtil - 1 map reduce job(s) failed!  
2015-06-02 23:06:03,934 [main] INFO  org.apache.pig.tools.pigstats.SimplePigStats - Script Statistics:

ars1skjm (Answer 1):

This looks like a known bug: https://issues.apache.org/jira/browse/pig-3358
If you can, try upgrading to Pig 0.14; according to the comments on that ticket, it has been fixed there.
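
A minimal sketch of what the STORE could look like after upgrading, assuming Pig 0.14 and its built-in AvroStorage (org.apache.pig.builtin.AvroStorage), which can take the output schema as a JSON string in its first constructor argument; CSVExcelStorage still comes from piggybank, so that jar would still need to be registered:

-- piggybank jar still needs to be REGISTERed for CSVExcelStorage
A = LOAD 'test.csv' USING org.apache.pig.piggybank.storage.CSVExcelStorage() AS (code:chararray);

-- built-in AvroStorage: pass the same Avro schema JSON directly
STORE A INTO 'test' USING org.apache.pig.builtin.AvroStorage(
    '{"namespace":"com.pig.test.avro","type":"record","name":"Avro_Test","doc":"Avro Test Schema",
        "fields":[
            {"name":"code","type":["string","null"],"default":null}
            ]}'
    );

Alternatively, keeping the original piggybank AvroStorage STORE unchanged may also work on 0.14, if that release's piggybank includes the fix referenced in the ticket's comments.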
