Apache Spark

smdnsysy posted on 2022-11-25 in Apache
Follow (0) | Answers (5) | Views (226)

Running a Python Spark application via an API call - on application submission - the response is - failed to SSH into the worker
My Python application lives at

/root/spark/work/driver-id/wordcount.py

The error can be found at

/root/spark/work/driver-id/stderr

and shows the following error:

Traceback (most recent call last):
  File "/root/wordcount.py", line 34, in <module>
    main()
  File "/root/wordcount.py", line 18, in main
    sc = SparkContext(conf=conf)
  File "/root/spark/python/lib/pyspark.zip/pyspark/context.py", line 115, in __init__
  File "/root/spark/python/lib/pyspark.zip/pyspark/context.py", line 172, in _do_init
  File "/root/spark/python/lib/pyspark.zip/pyspark/context.py", line 235, in _initialize_context
  File "/root/spark/python/lib/py4j-0.9-src.zip/py4j/java_gateway.py", line 1064, in __call__
  File "/root/spark/python/lib/py4j-0.9-src.zip/py4j/protocol.py", line 308, in get_return_value
py4j.protocol.Py4JJavaError: An error occurred while calling None.org.apache.spark.api.java.JavaSparkContext.
: java.io.FileNotFoundException: File file:/tmp/spark-events does not exist.
  at org.apache.hadoop.fs.RawLocalFileSystem.getFileStatus(RawLocalFileSystem.java:402)
  at org.apache.hadoop.fs.FilterFileSystem.getFileStatus(FilterFileSystem.java:255)
  at org.apache.spark.scheduler.EventLoggingListener.start(EventLoggingListener.scala:100)
  at org.apache.spark.SparkContext.<init>(SparkContext.scala:549)
  at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:59)
  at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
  at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
  at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
  at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
  at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:234)
  at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:381)
  at py4j.Gateway.invoke(Gateway.java:214)
  at py4j.commands.ConstructorCommand.invokeConstructor(ConstructorCommand.java:79)
  at py4j.commands.ConstructorCommand.execute(ConstructorCommand.java:68)
  at py4j.GatewayConnection.run(GatewayConnection.java:209)
  at java.lang.Thread.run(Thread.java:745)

It indicates that /tmp/spark-events does not exist, which is true. However, in wordcount.py:

from pyspark import SparkContext, SparkConf

... few more lines ...

def main():
    conf = SparkConf().setAppName("MyApp").setMaster("spark://ec2-54-209-108-127.compute-1.amazonaws.com:7077")
    sc = SparkContext(conf=conf)
    sc.stop()

if __name__ == "__main__":
    main()

x7rlezfr1#

/tmp/spark-events is where Spark stores its event logs. Just create this directory on the master machine:

$ mkdir /tmp/spark-events
$ sudo /root/spark-ec2/copy-dir /tmp/spark-events/
RSYNC'ing /tmp/spark-events to slaves...
ec2-54-175-163-32.compute-1.amazonaws.com
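
A minimal Python sketch of the same guard done from inside the driver program, assuming event logging is enabled with the default file:/tmp/spark-events location and that the driver runs on a node where this path is writable:

import os
from pyspark import SparkConf, SparkContext

# Create the default event-log directory if it is missing, so the
# EventLoggingListener started by SparkContext can open it.
if not os.path.isdir("/tmp/spark-events"):
    os.makedirs("/tmp/spark-events")

conf = SparkConf().setAppName("MyApp")
sc = SparkContext(conf=conf)
sc.stop()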

r6l8ljro2#

I ran into the same "File file:/tmp/spark-events does not exist." error when setting up the Spark history server on my local machine. I had customized my log directory to a non-default path. To resolve it, I needed to do two things.
1. Edit $SPARK_HOME/conf/spark-defaults.conf and add the following two lines:
spark.history.fs.logDirectory /mycustomdir
spark.eventLog.enabled true
2. Create a link from /tmp/spark-events to /mycustomdir:
ln -fs /tmp/spark-events /mycustomdir
Ideally, step 1 alone would have solved my problem, but I still needed to create the link, so I suspect there is another setting I missed. In any case, once I did this, I was able to run my history server and see new jobs recorded in the web UI.


zhte4eai3#

Use spark.eventLog.dir for the client/driver:

spark.eventLog.dir=/usr/local/spark/history

and use spark.history.fs.logDirectory for the history server:

spark.history.fs.logDirectory=/usr/local/spark/history

as described here: How to enable spark-history server for standalone cluster non hdfs mode
This applies at least as of Spark version 2.2.1.
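
A minimal sketch of setting the driver-side property programmatically instead of in spark-defaults.conf, assuming /usr/local/spark/history already exists on the driver node (spark.history.fs.logDirectory is read by the history server process, not by the application, so it is not set here):

from pyspark import SparkConf, SparkContext

conf = (SparkConf()
        .setAppName("MyApp")
        # Driver-side settings: where this application writes its event log.
        .set("spark.eventLog.enabled", "true")
        .set("spark.eventLog.dir", "/usr/local/spark/history"))

# spark.history.fs.logDirectory belongs in the history server's own
# configuration (spark-defaults.conf or SPARK_HISTORY_OPTS), not here.
sc = SparkContext(conf=conf)
sc.stop()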


nx7onnlm4#

I simply created /tmp/spark-events on the {master} node and then distributed it to the other nodes in the cluster, and it worked.

mkdir /tmp/spark-events
rsync -a /tmp/spark-events {slaves}:/tmp/spark-events

My spark-defaults.conf file:

spark.history.ui.port=18080
spark.eventLog.enabled=true
spark.history.fs.logDirectory=hdfs:///home/elon/spark/events
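
When the log directory lives in HDFS as above, the application's spark.eventLog.dir typically has to point at the same location so the history server can find the logs. A minimal sketch, assuming hdfs:///home/elon/spark/events has already been created in HDFS (for example with hdfs dfs -mkdir -p):

from pyspark import SparkConf, SparkContext

conf = (SparkConf()
        .setAppName("MyApp")
        .set("spark.eventLog.enabled", "true")
        # Must match the history server's spark.history.fs.logDirectory.
        .set("spark.eventLog.dir", "hdfs:///home/elon/spark/events"))

sc = SparkContext(conf=conf)
sc.stop()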

jpfvwuh45#

The history server started after I edited the two files spark-defaults.conf and spark-env.sh.
spark-defaults.conf:

spark.eventLog.enabled           true
spark.history.ui.port=18080
spark.history.fs.logDirectory={host}:{port}/directory

spark-env.sh:

export SPARK_HISTORY_OPTS="
-Dspark.history.ui.port=18080
-Dspark.history.fs.logDirectory={host}:{port}/directory
-Dspark.history.retainedApplications=30"
