When I start my spark-shell, I get a bunch of WARN messages that I can't make sense of. Is there anything important I need to deal with? Is there some configuration I've missed? Or are these WARN messages normal?
cliu@cliu-ubuntu:Apache-Spark$ spark-shell
log4j:WARN No appenders could be found for logger (org.apache.hadoop.metrics2.lib.MutableMetricsFactory).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
Using Spark's repl log4j profile: org/apache/spark/log4j-defaults-repl.properties
To adjust logging level use sc.setLogLevel("INFO")
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/ '_/
   /___/ .__/\_,_/_/ /_/\_\   version 1.5.2
      /_/
Using Scala version 2.10.4 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_66)
Type in expressions to have them evaluated.
Type :help for more information.
15/11/30 11:43:54 WARN Utils: Your hostname, cliu-ubuntu resolves to a loopback address: 127.0.1.1; using xxx.xxx.xxx.xx (`here I hide my IP`) instead (on interface wlan0)
15/11/30 11:43:54 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
15/11/30 11:43:55 WARN MetricsSystem: Using default name DAGScheduler for source because spark.app.id is not set.
Spark context available as sc.
15/11/30 11:43:58 WARN Connection: BoneCP specified but not present in CLASSPATH (or one of dependencies)
15/11/30 11:43:58 WARN Connection: BoneCP specified but not present in CLASSPATH (or one of dependencies)
15/11/30 11:44:11 WARN ObjectStore: Version information not found in metastore. hive.metastore.schema.verification is not enabled so recording the schema version 1.2.0
15/11/30 11:44:11 WARN ObjectStore: Failed to get database default, returning NoSuchObjectException
15/11/30 11:44:14 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
15/11/30 11:44:14 WARN Connection: BoneCP specified but not present in CLASSPATH (or one of dependencies)
15/11/30 11:44:14 WARN Connection: BoneCP specified but not present in CLASSPATH (or one of dependencies)
15/11/30 11:44:27 WARN ObjectStore: Version information not found in metastore. hive.metastore.schema.verification is not enabled so recording the schema version 1.2.0
15/11/30 11:44:27 WARN ObjectStore: Failed to get database default, returning NoSuchObjectException
SQL context available as sqlContext.
scala>
3 Answers
deyfvvtc1#
This one, the Utils warning that your hostname resolves to a loopback address, means that the hostname the driver managed to figure out for itself is not routable, and hence no remote connections are allowed. In your local environment this is not an issue, but in a multi-machine configuration Spark won't work properly. So the warning may or may not be a problem; it's just a heads-up.
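If you do want to silence it locally, the follow-up warning in the log already points at SPARK_LOCAL_IP. A minimal sketch, assuming a Bash shell and with the address below as a placeholder for your own routable IP:

# Bind the driver to an explicit routable address instead of the
# loopback-derived one (192.168.1.10 is a placeholder; use your own IP).
export SPARK_LOCAL_IP=192.168.1.10
spark-shell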
bis0qfac2#
The log messages are absolutely normal. Here, BoneCP tries to bind to a JDBC connection, and that is why you receive these warnings. In any case, if you want to manage what gets logged, you can set the logging level by copying the
<spark-path>/conf/log4j.properties.template
file to <spark-path>/conf/log4j.properties
and configuring it. Finally, a similar answer about logging levels can be found here: How to stop messages displaying on spark console?
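For example, a minimal sketch of that configuration, assuming the stock template shipped with Spark 1.x (where the root logger writes to the console):

# Copy the template into place (<spark-path> is your Spark install directory)
cp <spark-path>/conf/log4j.properties.template <spark-path>/conf/log4j.properties
# Then raise the root level in log4j.properties, e.g. change
#   log4j.rootCategory=INFO, console
# to
#   log4j.rootCategory=ERROR, console

For the current session only, you can instead run sc.setLogLevel("ERROR") inside the shell, as the startup banner suggests.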
k0pti3hp3#
To complement @Jacek Laskowski's answer about the
SPARK_LOCAL_IP
warning: I hit the same warning running spark-shell against a standalone Spark cluster on an Ubuntu 20.04 server. As expected, setting the
SPARK_LOCAL_IP
environment variable to $(hostname)
made the warning disappear, but then, although applications ran without problems, the worker GUI could not be reached on port 4040. The fix was to set
SPARK_LOCAL_HOSTNAME
instead of SPARK_LOCAL_IP
. With that, the warning goes away and the worker GUI is reachable through port 4040. I couldn't find information about this variable in the Spark documentation, but judging from Spark's source code it is used to set a custom local machine URI: https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/util/Utils.scala#L1058
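A minimal sketch of the setup that worked, assuming a Bash shell:

# Using the machine's hostname (rather than a raw IP) both silences the
# warning and keeps the web UI reachable on port 4040.
export SPARK_LOCAL_HOSTNAME=$(hostname)
spark-shell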