Java: dependency conflict between Apache Spark and Spring Boot

Asked by wdebmtf2 on 2023-02-07 in Java

I'm building a full-stack backtesting application for trading strategies, and at the moment I plan to use Spring Boot as the server and Apache Spark for data processing.
I tried putting Spring Boot and Apache Spark in the same project, but no luck — I can't resolve the dependency conflict, and I get this error:

Exception in thread "main" java.lang.NoClassDefFoundError: javax/servlet/Servlet
    at org.apache.spark.ui.SparkUI$.create(SparkUI.scala:223)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:484)
    at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2704)
    at org.apache.spark.sql.SparkSession$Builder.$anonfun$getOrCreate$2(SparkSession.scala:953)
    at scala.Option.getOrElse(Option.scala:201)
    at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:947)
    at com.samurai.lab.marketdata.MarketDataApplication.main(MarketDataApplication.java:12)
Caused by: java.lang.ClassNotFoundException: javax.servlet.Servlet
    at java.base/jdk.internal.loader.BuiltinClassLoader.loadClass(BuiltinClassLoader.java:641)
    at java.base/jdk.internal.loader.ClassLoaders$AppClassLoader.loadClass(ClassLoaders.java:188)
    at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:520)
    ... 7 more
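For context: the missing class is javax.servlet.Servlet. Spring Boot 3 is built on Jakarta EE 9+, so it brings in jakarta.servlet-api (package jakarta.servlet), while the Jetty bundled with Spark 3.3 still looks for the old javax.servlet API. A minimal first step to sketch — assuming the two APIs can coexist on the classpath — is to declare the legacy servlet API explicitly in the pom:

<dependency>
    <groupId>javax.servlet</groupId>
    <artifactId>javax.servlet-api</artifactId>
    <version>4.0.1</version>
</dependency>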

If I add the Hadoop dependencies, then I get a different error:

Exception in thread "main" javax.servlet.UnavailableException: Servlet class org.glassfish.jersey.servlet.ServletContainer is not a javax.servlet.Servlet
    at org.sparkproject.jetty.servlet.ServletHolder.checkServletType(ServletHolder.java:514)
    at org.sparkproject.jetty.servlet.ServletHolder.doStart(ServletHolder.java:386)
    at org.sparkproject.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:73)
    at org.sparkproject.jetty.servlet.ServletHandler.lambda$initialize$0(ServletHandler.java:749)
    at java.base/java.util.stream.SortedOps$SizedRefSortingSink.end(SortedOps.java:357)
    at java.base/java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:510)
    at java.base/java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:499)
    at java.base/java.util.stream.StreamSpliterators$WrappingSpliterator.forEachRemaining(StreamSpliterators.java:310)
    at java.base/java.util.stream.Streams$ConcatSpliterator.forEachRemaining(Streams.java:735)
    at java.base/java.util.stream.ReferencePipeline$Head.forEach(ReferencePipeline.java:762)
    at org.sparkproject.jetty.servlet.ServletHandler.initialize(ServletHandler.java:774)
    at org.sparkproject.jetty.servlet.ServletContextHandler.startContext(ServletContextHandler.java:379)
    at org.sparkproject.jetty.server.handler.ContextHandler.doStart(ContextHandler.java:916)
    at org.sparkproject.jetty.servlet.ServletContextHandler.doStart(ServletContextHandler.java:288)
    at org.sparkproject.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:73)
    at org.apache.spark.ui.ServerInfo.addHandler(JettyUtils.scala:491)
    at org.apache.spark.ui.SparkUI.$anonfun$attachAllHandler$2(SparkUI.scala:76)
    at org.apache.spark.ui.SparkUI.$anonfun$attachAllHandler$2$adapted(SparkUI.scala:76)
    at scala.collection.IterableOnceOps.foreach(IterableOnce.scala:563)
    at scala.collection.IterableOnceOps.foreach$(IterableOnce.scala:561)
    at scala.collection.AbstractIterable.foreach(Iterable.scala:926)
    at org.apache.spark.ui.SparkUI.$anonfun$attachAllHandler$1(SparkUI.scala:76)
    at org.apache.spark.ui.SparkUI.$anonfun$attachAllHandler$1$adapted(SparkUI.scala:74)
    at scala.Option.foreach(Option.scala:437)
    at org.apache.spark.ui.SparkUI.attachAllHandler(SparkUI.scala:74)
    at org.apache.spark.SparkContext.$anonfun$new$31(SparkContext.scala:648)
    at org.apache.spark.SparkContext.$anonfun$new$31$adapted(SparkContext.scala:648)
    at scala.Option.foreach(Option.scala:437)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:648)
    at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2704)
    at org.apache.spark.sql.SparkSession$Builder.$anonfun$getOrCreate$2(SparkSession.scala:953)
    at scala.Option.getOrElse(Option.scala:201)
    at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:947)
    at com.samurai.lab.marketdata.MarketDataApplication.main(MarketDataApplication.java:12)

Process finished with exit code 1
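This second failure is the flip side of the first: a servlet class is now present, but Spark's relocated Jetty (org.sparkproject.jetty) rejects Jersey's ServletContainer because the javax.servlet.Servlet interface it was compiled against is not the one Jetty sees — typically a sign that two different servlet API jars (or a jakarta-based copy) are competing on the classpath. A quick diagnostic, assuming a standard Maven setup, is to list which servlet APIs actually get resolved:

mvn dependency:tree -Dincludes=javax.servlet,jakarta.servlet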

I excluded the servlet artifacts from Spark like this:

<exclusions>
    <exclusion>
        <groupId>javax.servlet</groupId>
        <artifactId>javax.servlet-api</artifactId>
    </exclusion>
    <exclusion>
        <groupId>org.glassfish</groupId>
        <artifactId>javax.servlet</artifactId>
    </exclusion>
    <exclusion>
        <groupId>org.eclipse.jetty.orbit</groupId>
        <artifactId>javax.servlet</artifactId>
    </exclusion>
</exclusions>
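Note that an <exclusions> block only takes effect inside the dependency that drags the artifact in. A sketch of how the block above would attach to spark-core (coordinates taken from the pom below):

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.13</artifactId>
    <version>${apache-spark.version}</version>
    <exclusions>
        <exclusion>
            <groupId>javax.servlet</groupId>
            <artifactId>javax.servlet-api</artifactId>
        </exclusion>
    </exclusions>
</dependency>

Excluding the API outright, though, just reproduces the first NoClassDefFoundError, because Spark's UI still needs those classes at runtime.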

That didn't help.
Here is the clean pom.xml I'm working with right now:

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <parent>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-parent</artifactId>
        <version>3.0.2</version>
        <relativePath/> <!-- lookup parent from repository -->
    </parent>
    <groupId>com.samurai.lab</groupId>
    <artifactId>market-data</artifactId>
    <version>0.0.1-SNAPSHOT</version>
    <name>market-data</name>
    <description>Demo project for Spring Boot</description>
    <properties>
        <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
        <project.reporting.outputEncoding>UTF-8</project.reporting.outputEncoding>
        <java.version>17</java.version>
        <apache-spark.version>3.3.1</apache-spark.version>
        <hadoop.version>3.3.2</hadoop.version>
    </properties>
    <dependencies>

        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-web</artifactId>
            <exclusions>
                <exclusion>
                    <groupId>log4j</groupId>
                    <artifactId>*</artifactId>
                </exclusion>
                <exclusion>
                    <groupId>org.slf4j</groupId>
                    <artifactId>*</artifactId>
                </exclusion>
                <exclusion>
                    <groupId>org.apache.logging.log4j</groupId>
                    <artifactId>*</artifactId>
                </exclusion>
            </exclusions>
        </dependency>

        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-test</artifactId>
            <scope>test</scope>
        </dependency>

        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_2.13</artifactId>
            <version>${apache-spark.version}</version>
        </dependency>

        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-sql_2.13</artifactId>
            <version>${apache-spark.version}</version>
        </dependency>
    </dependencies>

    <build>
        <plugins>
            <plugin>
                <groupId>org.springframework.boot</groupId>
                <artifactId>spring-boot-maven-plugin</artifactId>
            </plugin>
        </plugins>
    </build>

</project>

Any ideas on how to make this work and resolve the conflict?
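One observation for anyone hitting the same wall: both stack traces originate in SparkUI creation, so if you don't need Spark's web UI (often the case when Spark is embedded in a server), disabling it avoids starting Jetty — and the servlet API clash — altogether. A minimal sketch (the class name is illustrative, not the poster's actual entry point):

import org.apache.spark.sql.SparkSession;

public class SparkNoUiExample {
    public static void main(String[] args) {
        // With the UI disabled, SparkUI/Jetty is never initialized,
        // which is where both of the errors above are thrown from.
        SparkSession spark = SparkSession.builder()
                .appName("market-data")
                .master("local[*]")
                .config("spark.ui.enabled", "false")
                .getOrCreate();
        System.out.println("Spark " + spark.version() + " started without the web UI");
        spark.stop();
    }
}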

UPDATE 1

These are the dependencies that come in as part of spark-core:

[dependency tree screenshot from the original post]

Answer 1 (mwyxok5s):

If you're using IntelliJ — and I can't see which javax version you have — start by reloading all the Maven projects, then check the project's dependency tree for this library.


Answer 2 (6kkfgxo0):

Add a scope to the dependency and try again:

<dependency>
    <groupId>javax.servlet</groupId>
    <artifactId>javax.servlet-api</artifactId>
    <scope>provided</scope>
</dependency>

Or with the compile scope.
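A caveat on the provided scope for later readers: provided assumes the runtime container supplies the javax.servlet classes, but Spring Boot 3's embedded Tomcat 10 ships the jakarta.servlet API instead, so with this setup the default compile scope is the safer assumption.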
