Java — error when adding an HDFS file to Hive and using it in a UDF

t1rydlwq · posted 2021-05-27 in Hadoop

I have a file named file.txt that stores configuration I want to use in my UDF, as shown in the code below.

import java.io.BufferedReader;
import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStreamReader;
import java.nio.charset.StandardCharsets;
import java.util.Arrays;
import java.util.HashSet;

import org.apache.hadoop.hive.ql.exec.UDFArgumentException;
import org.apache.hadoop.hive.ql.exec.UDFArgumentLengthException;
import org.apache.hadoop.hive.ql.udf.generic.GenericUDF;
import org.apache.hadoop.hive.serde2.objectinspector.ObjectInspector;
import org.apache.hadoop.hive.serde2.objectinspector.primitive.PrimitiveObjectInspectorFactory;

public class ParseTraceIdUDF extends GenericUDF {
    private transient ObjectInspector[] argumentOIs;
    public static String dataFile = "file.txt";
    public static final String SEP = "-";
    public static HashSet<String> targetTraces = new HashSet<String>();
    public static HashSet<String> targets = new HashSet<String>();

    public void ReadFile() {
        try (FileInputStream fis = new FileInputStream(dataFile);
             InputStreamReader isr = new InputStreamReader(fis, StandardCharsets.UTF_8);
             BufferedReader br = new BufferedReader(isr)) {
            String line;
            while ((line = br.readLine()) != null) {
                line = line.trim();
                targetTraces.add(line);
                // Keep every SEP-separated segment except the last as a target.
                String[] tmp = line.split(SEP);
                targets.addAll(Arrays.asList(tmp).subList(0, tmp.length - 1));
            }
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    public ObjectInspector initialize(ObjectInspector[] args)
            throws UDFArgumentException {
        if (args.length > 2) {
            throw new UDFArgumentLengthException("The operator accepts at most 2 arguments.");
        }
        ReadFile();
        argumentOIs = args;
        return PrimitiveObjectInspectorFactory.javaStringObjectInspector;
    }
    ...
}
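For clarity, the parsing inside ReadFile (split each line on SEP and keep every segment except the last as a target) can be sketched in isolation. TraceParser is a throwaway name for illustration, not part of the real UDF:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

// Stand-alone version of the line-parsing logic used in ReadFile().
public class TraceParser {
    public static final String SEP = "-";

    // "a-b-c" -> ["a", "b"]: every segment except the last.
    public static List<String> prefixes(String line) {
        String[] tmp = line.trim().split(SEP);
        return new ArrayList<>(Arrays.asList(tmp).subList(0, tmp.length - 1));
    }

    public static void main(String[] args) {
        System.out.println(prefixes("svcA-svcB-42")); // prints [svcA, svcB]
    }
}
```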

When I put the file on the local file system and add it in Hive like this, it runs fine:

ADD JAR /***/***/rd.jar;
ADD FILE /**/**/file.txt;
CREATE TEMPORARY FUNCTION parse_trace_id AS 'com.hive.udf.PTIdUDF';
...

But when the file is on HDFS and I add it like this, an error occurs:

ADD JAR /***/***/rd.jar;
ADD FILE hdfs://**/**/file.txt;
CREATE TEMPORARY FUNCTION parse_trace_id AS 'com.hive.udf.PTIdUDF';
...

The error log is below, and I would like to know the cause. Any help would be greatly appreciated.

java.io.FileNotFoundException: file.txt (No such file or directory)
    at java.io.FileInputStream.open0(Native Method)
    at java.io.FileInputStream.open(FileInputStream.java:195)
    at java.io.FileInputStream.<init>(FileInputStream.java:138)
    at java.io.FileInputStream.<init>(FileInputStream.java:93)
    at com.vivo.hive.udf.ParseTraceIdUDF.ReadFile(ParseTraceIdUDF.java:35)
    at com.vivo.hive.udf.ParseTraceIdUDF.initialize(ParseTraceIdUDF.java:71)
    at org.apache.hadoop.hive.ql.udf.generic.GenericUDF.initializeAndFoldConstants(GenericUDF.java:145)
    at org.apache.hadoop.hive.ql.plan.ExprNodeGenericFuncDesc.newInstance(ExprNodeGenericFuncDesc.java:233)
    at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$DefaultExprProcessor.getXpathOrFuncExprNodeDesc(TypeCheckProcFactory.java:959)
    at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory$DefaultExprProcessor.process(TypeCheckProcFactory.java:1176)
    at org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispatcher.java:90)
    at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatchAndReturn(DefaultGraphWalker.java:94)
    at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.java:78)
    at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java:132)
    at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWalker.java:109)
    at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory.genExprNode(TypeCheckProcFactory.java:193)
    at org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory.genExprNode(TypeCheckProcFactory.java:146)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genAllExprNodeDesc(SemanticAnalyzer.java:10621)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genExprNodeDesc(SemanticAnalyzer.java:10577)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genSelectPlan(SemanticAnalyzer.java:3874)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genSelectPlan(SemanticAnalyzer.java:3653)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPostGroupByBodyPlan(SemanticAnalyzer.java:9029)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer.java:8984)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:9851)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:9744)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genOPTree(SemanticAnalyzer.java:10217)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:10228)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:10108)
    at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:223)
    at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:558)
    at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:1356)
    at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1473)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1285)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1275)
    at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:226)
    at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:175)
    at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:389)
    at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:324)
    at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:726)
    at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:699)
    at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:634)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
FAILED: NullPointerException null
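From the trace, the FileNotFoundException is thrown by new FileInputStream(dataFile) inside ReadFile(), which initialize() calls while the query is still being compiled, so the resource added via ADD FILE hdfs://... apparently is not available under its bare name at that point. In case it is relevant, here is a hypothetical helper (ResourceName and baseName are my own names, not Hive API) showing the base name a localized resource would be opened under in the working directory:

```java
// Hypothetical helper: reduce a resource URI such as
// "hdfs://nn/dir/file.txt" to the base name "file.txt" that the
// UDF tries to open relative to its working directory.
public class ResourceName {
    public static String baseName(String uri) {
        int slash = uri.lastIndexOf('/');
        return slash < 0 ? uri : uri.substring(slash + 1);
    }

    public static void main(String[] args) {
        System.out.println(baseName("hdfs://nn/dir/file.txt")); // prints file.txt
        System.out.println(baseName("file.txt"));               // prints file.txt
    }
}
```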
