Maven build of Java Spark 1.2 fails with errors in the com.google.common package

6za6bjd0 · posted 2021-05-30 · in Hadoop

CentOS 6.2
Hadoop 2.6.0
Scala 2.10.5

java -version:

java version "1.7.0_75"
OpenJDK Runtime Environment (rhel-2.5.4.0.el6_6-x86_64 u75-b13)
OpenJDK 64-Bit Server VM (build 24.75-b04, mixed mode)

mvn -version:

Apache Maven 3.3.1 (cab6659f9874fa96462afef40fcf6bc033d58c1c; 2015-03-13T21:10:27+01:00)
Maven home: /opt/maven
Java version: 1.7.0_75, vendor: Oracle Corporation
Java home: /usr/lib/jvm/java-1.7.0-openjdk-1.7.0.75.x86_64/jre
Default locale: en_US, platform encoding: UTF-8
OS name: "linux", version: "2.6.32-220.el6.x86_64", arch: "amd64", family: "unix"
Environment variables

export SCALA_HOME=/opt/scala
export JAVA_HOME=/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.75.x86_64
export JRE_HOME=/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.75.x86_64/jre
export HADOOP_HOME=/home/tom/hadoop
export SPARK_HOME=/home/tom/spark
export HADOOP_INSTALL=$HADOOP_HOME
export HADOOP_MAPRED_HOME=$HADOOP_HOME
export HADOOP_COMMON_HOME=$HADOOP_HOME
export HADOOP_HDFS_HOME=$HADOOP_HOME
export YARN_HOME=$HADOOP_HOME
export HADOOP_YARN_HOME=$HADOOP_HOME
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib
# MAVEN_HOME must be exported before PATH references it
export MAVEN_HOME=/opt/maven
export PATH=$PATH:$JAVA_HOME/bin:$JRE_HOME/bin:$HADOOP_HOME/sbin:$HADOOP_HOME/bin:$SPARK_HOME/bin:$MAVEN_HOME/bin:$SCALA_HOME/bin

export SPARK_EXAMPLES_JAR=$SPARK_HOME/spark-0.7.2/examples/target/scala-2.9.3/spark-examples_2.9.3-0.7.2.jar
export HADOOP_CONF_DIR=$HADOOP_HOME/etc/hadoop
export YARN_CONF_DIR=$HADOOP_HOME/etc/hadoop
export HADOOP_OPTS="$HADOOP_OPTS -Djava.library.path=$HADOOP_HOME/lib/"

Build command:

mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.6.0 -Phive -Phive-0.12.0 -Phive-thriftserver -DskipTests clean package
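
For readability, here is the same invocation broken across lines, with a best-effort gloss of each flag based on the Spark 1.2 build documentation (the glosses are my annotations, not part of the original post):

# -Pyarn                  : enable the YARN integration modules
# -Phadoop-2.4            : compatibility profile for Hadoop 2.4 and later
# -Dhadoop.version=2.6.0  : exact Hadoop version to compile against
# -Phive, -Phive-0.12.0   : Spark SQL Hive support, pinned to Hive 0.12.0
# -Phive-thriftserver     : JDBC/Thrift server for Spark SQL
# -DskipTests             : skip unit tests to speed up the build
mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.6.0 \
    -Phive -Phive-0.12.0 -Phive-thriftserver \
    -DskipTests clean package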
Error message:

[ERROR] /home/tom/spark/external/flume/src/main/scala/org/apache/spark/streaming/flume/FlumeBatchFetcher.scala:22: object Throwables is not a member of package com.google.common.base
[ERROR] import com.google.common.base.Throwables
[ERROR]        ^
[ERROR] /home/tom/spark/external/flume/src/main/scala/org/apache/spark/streaming/flume/FlumeBatchFetcher.scala:59: not found: value Throwables
[ERROR]           Throwables.getRootCause(e) match {
[ERROR]           ^
[ERROR] /home/tom/spark/external/flume/src/main/scala/org/apache/spark/streaming/flume/FlumePollingInputDStream.scala:26: object util is not a member of package com.google.common
[ERROR] import com.google.common.util.concurrent.ThreadFactoryBuilder
[ERROR]                          ^
[ERROR] /home/tom/spark/external/flume/src/main/scala/org/apache/spark/streaming/flume/FlumePollingInputDStream.scala:69: not found: type ThreadFactoryBuilder
[ERROR]     Executors.newCachedThreadPool(new ThreadFactoryBuilder().setDaemon(true).
[ERROR]                                       ^
[ERROR] /home/tom/spark/external/flume/src/main/scala/org/apache/spark/streaming/flume/FlumePollingInputDStream.scala:76: not found: type ThreadFactoryBuilder
[ERROR]     new ThreadFactoryBuilder().setDaemon(true).setNameFormat("Flume Receiver Thread - %d").build())
[ERROR]         ^
[ERROR] 5 errors found

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Spark Project Parent POM ........................... SUCCESS [ 10.121 s]
[INFO] Spark Project Networking ........................... SUCCESS [ 14.957 s]
[INFO] Spark Project Shuffle Streaming Service ............ SUCCESS [ 10.858 s]
[INFO] Spark Project Core ................................. SUCCESS [07:33 min]
[INFO] Spark Project Bagel ................................ SUCCESS [ 52.312 s]
[INFO] Spark Project GraphX ............................... SUCCESS [02:19 min]
[INFO] Spark Project Streaming ............................ SUCCESS [03:28 min]
[INFO] Spark Project Catalyst ............................. SUCCESS [03:18 min]
[INFO] Spark Project SQL .................................. SUCCESS [03:48 min]
[INFO] Spark Project ML Library ........................... SUCCESS [03:40 min]
[INFO] Spark Project Tools ................................ SUCCESS [ 29.380 s]
[INFO] Spark Project Hive ................................. SUCCESS [02:53 min]
[INFO] Spark Project REPL ................................. SUCCESS [01:32 min]
[INFO] Spark Project YARN Parent POM ...................... SUCCESS [  5.124 s]
[INFO] Spark Project YARN Stable API ...................... SUCCESS [01:34 min]
[INFO] Spark Project Hive Thrift Server ................... SUCCESS [ 56.404 s]
[INFO] Spark Project Assembly ............................. SUCCESS [01:11 min]
[INFO] Spark Project External Twitter ..................... SUCCESS [ 36.661 s]
[INFO] Spark Project External Flume Sink .................. SUCCESS [ 50.006 s]
[INFO] Spark Project External Flume ....................... FAILURE [ 14.287 s]
[INFO] Spark Project External MQTT ........................ SKIPPED
[INFO] Spark Project External ZeroMQ ...................... SKIPPED
[INFO] Spark Project External Kafka ....................... SKIPPED
[INFO] Spark Project Examples ............................. SKIPPED
[INFO] Spark Project YARN Shuffle Service ................. SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 36:02 min
[INFO] Finished at: 2015-04-04T03:58:19+02:00
[INFO] Final Memory: 60M/330M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal net.alchim31.maven:scala-maven-plugin:3.2.0:compile (scala-compile-first) on project spark-streaming-flume_2.10: Execution scala-compile-first of goal net.alchim31.maven:scala-maven-plugin:3.2.0:compile failed. CompileFailed -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:

I suspect there is some issue here, but I cannot figure out what it is. Can anyone help me?


ipakzgxi1#

I had a similar problem today. The Spark Project External Flume ....................... FAILURE line in the log bothered me as well, but I think it was git clean -xdf that helped. If that is not enough, also try git clean -Xdf. Then run mvn ... again. Good luck!
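
As a minimal sketch, the recovery sequence suggested above could look like this (the mvn flags are simply copied from the question's build command; substitute your own):

# Remove all untracked files, untracked directories (-d) and
# ignored files (-x) from the working tree; -f forces deletion.
git clean -xdf

# If that is not enough, -X removes only the files git ignores.
git clean -Xdf

# Rebuild from the now-clean tree.
mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.6.0 \
    -Phive -Phive-0.12.0 -Phive-thriftserver \
    -DskipTests clean package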


4si2a6ki2#

If cheating is acceptable, you can skip the modules that fail to compile:
spark-streaming-flume_2.10 and spark-streaming-kafka_2.10

The following command compiles a Spark package with Hive support for Spark SQL, against CDH 5.3.3 and Spark 1.2.0:

mvn -Pyarn -Dhadoop.version=2.5.0-cdh5.3.3 -DskipTests -Phive -Phive-thriftserver -pl '!org.apache.spark:spark-streaming-flume_2.10,!org.apache.spark:spark-streaming-kafka_2.10' package
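
A note on the -pl option used above: a leading ! excludes the named module from the reactor, a form that, to my knowledge, requires Maven 3.2.1 or newer. The same command, wrapped for readability:

# Build everything except the two failing streaming modules.
# '!groupId:artifactId' in -pl excludes a module from the reactor.
mvn -Pyarn -Dhadoop.version=2.5.0-cdh5.3.3 \
    -DskipTests -Phive -Phive-thriftserver \
    -pl '!org.apache.spark:spark-streaming-flume_2.10,!org.apache.spark:spark-streaming-kafka_2.10' \
    package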


6ovsh4lw3#

I ran into the same problem with Apache Spark 1.2.1 when building with the following command:

mvn -e -Pyarn -Phadoop-2.6 -Phive -Phive-thriftserver -DskipTests clean package

The version of Apache Maven seems to play a role here. In the failing case, the Maven version was:

mvn -version

Apache Maven **3.3.3** (7994120775791599e205a5524ec3e0dfe41d4a06; 2015-04-22T07:57:37-04:00)
Maven home: /opt/Spark/amit/apache-maven/apache-maven-3.3.3
Java version: 1.8.0, vendor: IBM Corporation
Java home: /opt/Spark/amit/ibmjava8sdk/sdk/jre
Default locale: en_US, platform encoding: UTF-8
OS name: "linux", version: "3.14.8-200.fc20.x86_64", arch: "amd64", family: "unix"

When I tried an older Maven, the build succeeded. Using Apache Maven 3.2.x seems to fix the problem. I used:

mvn -version

Apache Maven **3.2.5** (12a6b3acb947671f09b81f49094c53f426d8cea1; 2014-12-14T12:29:23-05:00)
Maven home: /opt/Spark/amit/apache-maven/apache-maven-3.2.5
Java version: 1.8.0, vendor: IBM Corporation
Java home: /opt/Spark/amit/ibmjava8sdk/sdk/jre
Default locale: en_US, platform encoding: UTF-8
OS name: "linux", version: "3.14.8-200.fc20.x86_64", arch: "amd64", family: "unix"
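
If you need to drop back to Maven 3.2.5, a minimal sketch (the /opt install path is an arbitrary choice, and the URL assumes the Apache archive still hosts this release):

# Download and unpack Apache Maven 3.2.5 from the Apache archive.
cd /opt
wget https://archive.apache.org/dist/maven/maven-3/3.2.5/binaries/apache-maven-3.2.5-bin.tar.gz
tar xzf apache-maven-3.2.5-bin.tar.gz

# Put the older Maven first on PATH for this shell only,
# then confirm the version before rebuilding Spark.
export PATH=/opt/apache-maven-3.2.5/bin:$PATH
mvn -version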

Hope this helps.
Thanks, Amit
