Hadoop depends on two different versions of beanutils

Asked by 8ehkhllq on 2021-06-03 in Hadoop

Hadoop 2.4.0 depends on two different versions of commons-beanutils, which causes the following error from sbt-assembly:

[error] (*:assembly) deduplicate: different file contents found in the following:
[error] .ivy2/cache/commons-beanutils/commons-beanutils/jars/commons-beanutils-1.7.0.jar:org/apache/commons/beanutils/BasicDynaBean.class
[error] .ivy2/cache/commons-beanutils/commons-beanutils-core/jars/commons-beanutils-core-1.8.0.jar:org/apache/commons/beanutils/BasicDynaBean.class

Both dependencies are transitive from hadoop 2.4.0, which can be confirmed by accessing Ivy directly (i.e. viewing the resolution report or executing Ivy commands); one way to inspect the tree from sbt is sketched below.
How do I build an sbt assembly that includes hadoop 2.4.0?
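
This sketch is not from the original post: a common way to inspect the resolved tree is the sbt-dependency-graph plugin. Assuming sbt 0.13 and plugin version 0.7.4 (both illustrative):

// project/plugins.sbt
addSbtPlugin("net.virtual-void" % "sbt-dependency-graph" % "0.7.4")

// build.sbt -- the 0.7.x plugin also needs its settings wired in
net.virtualvoid.sbt.graph.Plugin.graphSettings

Running sbt dependencyTree should then show both commons-beanutils 1.7.0 and commons-beanutils-core 1.8.0 somewhere under the hadoop-client subtree.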
Update: as requested, here are the build.sbt dependencies:

libraryDependencies += "org.apache.hadoop" % "hadoop-client" % "2.4.0"

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.0.0"  % "provided" exclude("org.apache.hadoop", "hadoop-client")

resolvers += "Akka Repository" at "http://repo.akka.io/releases/"

libraryDependencies += "com.amazonaws" % "aws-java-sdk" % "1.7.8"

libraryDependencies += "commons-io" % "commons-io" % "2.4"

libraryDependencies += "javax.servlet" % "javax.servlet-api" % "3.0.1" % "provided"

libraryDependencies += "com.sksamuel.elastic4s" %% "elastic4s" % "1.1.1.0"

The hadoop-client exclude is needed because Spark pulls in Hadoop 1, which conflicts with Hadoop 2.
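
An alternative to merge strategies (my suggestion, not part of the original post) is to remove the duplicate at the source: commons-beanutils-core 1.8.0 contains the same org.apache.commons.beanutils classes as commons-beanutils 1.7.0, so excluding the older artifact leaves a single copy on the classpath. A sketch, assuming nothing in your build needs classes that exist only in 1.7.0:

// assumption: commons-beanutils-core 1.8.0 provides all the classes that
// hadoop-client actually uses from commons-beanutils 1.7.0
libraryDependencies += "org.apache.hadoop" % "hadoop-client" % "2.4.0" exclude("commons-beanutils", "commons-beanutils")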


Answer 1 (wkyowqbh):

Try adding a merge strategy to your build.sbt, like the following:

// regex matching any path containing "META.INF"; the unescaped '.' is
// deliberate -- it also matches the '-' in "META-INF"
val meta = """META.INF(.)*""".r

// sbt 0.13 / old sbt-assembly syntax: wrap the previous strategy and
// override it for the paths that collide
mergeStrategy in assembly <<= (mergeStrategy in assembly) { (old) =>
  {
    // for duplicated classes, keep the last copy seen on the classpath
    case PathList("javax", "servlet", xs @ _*) => MergeStrategy.last
    case PathList("javax", "activation", xs @ _*) => MergeStrategy.last
    case PathList("org", "apache", xs @ _*) => MergeStrategy.last // covers the beanutils clash
    case PathList("com", "esotericsoftware", xs @ _*) => MergeStrategy.last
    case PathList("plugin.properties") => MergeStrategy.last
    // discard META-INF entries (manifests, signatures) that commonly collide
    case meta(_) => MergeStrategy.discard
    // everything else: defer to the previous strategy
    case x => old(x)
  }
}
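
Note that mergeStrategy in assembly <<= ... is sbt 0.13-era syntax. On sbt 1.x with a current sbt-assembly, the same idea, following the pattern in the plugin's README, would look roughly like this (a sketch, not part of the original answer):

ThisBuild / assemblyMergeStrategy := {
  // same idea: keep one copy of clashing classes, drop colliding META-INF entries
  case PathList("org", "apache", xs @ _*) => MergeStrategy.last
  case PathList("META-INF", xs @ _*) => MergeStrategy.discard
  case x =>
    // fall back to whatever strategy was previously defined
    val oldStrategy = (ThisBuild / assemblyMergeStrategy).value
    oldStrategy(x)
}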
