How to fix Flink pipeline NoClassDefFoundError: "org/apache/hadoop/conf/Configuration"

zzwlnbp8 · asked 2023-04-10 · in Apache

I implemented an Apache Flink pipeline and got the following error:

NoClassDefFoundError: "org/apache/hadoop/conf/Configuration"

Does this mean I am missing a dependency, or is it a classloading problem?
Flink docs: Debugging Classloading

hc2pp10m

If you are running a standalone pipeline (no Hadoop installation, which is usually the case when running locally), you need to provide the missing classes yourself. Adding hadoop-common supplies them. In some cases, for example with the flink-parquet dependency, you need additional classes as well, so it is popular to include hadoop-client instead. If you run into problems, see Stack Overflow.

<!-- https://mvnrepository.com/artifact/org.apache.hadoop/hadoop-common -->
<!-- Exclusions based on Flink source -->
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-common</artifactId>
    <version>3.3.5</version>
    <exclusions>
        <exclusion>
            <groupId>com.google.protobuf</groupId>
            <artifactId>protobuf-java</artifactId>
        </exclusion>
        <exclusion>
            <groupId>ch.qos.reload4j</groupId>
            <artifactId>reload4j</artifactId>
        </exclusion>
        <exclusion>
            <groupId>org.slf4j</groupId>
            <artifactId>slf4j-reload4j</artifactId>
        </exclusion>
    </exclusions>
</dependency>

Alternatively:

<!-- https://mvnrepository.com/artifact/org.apache.hadoop/hadoop-client -->
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-client</artifactId>
    <version>3.3.5</version>
</dependency>
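Note that if you deploy to a cluster whose environment already supplies the Hadoop classes (see the HADOOP_CLASSPATH note below), you generally do not want to bundle them into your fat JAR as well. A hedged sketch using Maven's standard `provided` scope (this scope element is my suggestion, not part of the original answer):

```xml
<!-- Hadoop classes come from the cluster environment at runtime,
     so exclude them from the shaded/fat JAR -->
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-client</artifactId>
    <version>3.3.5</version>
    <scope>provided</scope>
</dependency>
```

With `provided` scope the classes are still on the compile classpath locally, but Maven leaves them out of the packaged artifact, avoiding version clashes with the cluster's own Hadoop jars.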

You may also need this:

<!-- https://mvnrepository.com/artifact/org.apache.flink/flink-hadoop-compatibility -->
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-hadoop-compatibility_${scala.binary.version}</artifactId>
    <version>${flink.version}</version>
</dependency>

Otherwise, if you have a Hadoop installation that should provide the classes, make sure HADOOP_CLASSPATH is set correctly in your environment.
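A minimal sketch of setting that variable, assuming the `hadoop` command from your installation is on PATH (paths and the job invocation are illustrative):

```shell
# Ask the Hadoop installation for its full classpath and export it
# so Flink picks up org.apache.hadoop.conf.Configuration at startup.
export HADOOP_CLASSPATH=$(hadoop classpath)

# Start the cluster / submit the job from the same shell,
# so the variable is inherited by the Flink processes.
./bin/start-cluster.sh
./bin/flink run target/my-pipeline.jar
```

This is the approach the Flink documentation recommends for Hadoop integration: Flink reads HADOOP_CLASSPATH at startup rather than bundling Hadoop itself.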
