Unable to fetch binaries

vsikbqxv · posted 2021-06-21 in Mesos

Trying to run a Spark job on a Mesos cluster; it fails while fetching the Spark binary. I tried to fetch the binary from:

- HDFS
- the local filesystem on the slaves

I used the following paths for `SPARK_EXECUTOR_URI`:

Filesystem path - `file://home/labadmin/spark-1.2.1.tgz`:
```
I0501 10:27:19.302435 30510 fetcher.cpp:214] Fetching URI 'file://home/labadmin/spark-1.2.1.tgz'
Failed to fetch: file://home/labadmin/spark-1.2.1.tgz
Failed to synchronize with slave (it's probably exited)
```
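A side note on the `file:` URI itself (my reading, not stated in the post): in a `file://` URI, the part right after the two slashes is parsed as a host, so `file://home/labadmin/...` means host `home` with path `/labadmin/...`. A local absolute path needs three slashes. A minimal sketch of the two forms, assuming the tarball really sits at `/home/labadmin/spark-1.2.1.tgz` and that `SPARK_EXECUTOR_URI` is exported from `spark-env.sh`:

```
# Broken: "home" is parsed as the URI authority (a host named "home"),
# so the fetcher looks for /labadmin/spark-1.2.1.tgz on a host called "home".
export SPARK_EXECUTOR_URI=file://home/labadmin/spark-1.2.1.tgz

# A file URI for a local absolute path has an empty authority: three slashes.
export SPARK_EXECUTOR_URI=file:///home/labadmin/spark-1.2.1.tgz
```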

HDFS path without a port - `hdfs://ipaddress/spark/spark-1.2.1.tgz`:
```
I0427 09:23:21.616092  4842 fetcher.cpp:214] Fetching URI 'hdfs://ipaddress/spark/spark-1.2.1.tgz'
E0427 09:23:24.710765  4842 fetcher.cpp:113] HDFS copyToLocal failed: /usr/lib/hadoop/bin/hadoop fs -copyToLocal 'hdfs://ipaddress/spark/spark-1.2.1.tgz' '/tmp/mesos/slaves/20150427-054938-2933394698-5050-1030-S0/frameworks/20150427-054938-2933394698-5050-1030-0002/executors/20150427-054938-2933394698-5050-1030-S0/runs/5c13004a-3d8c-40a4-bac4-9c07249e1923/spark-1.2.1.tgz'
copyToLocal: Call From sclq174.lss.emc.com/ipaddress to sclq174.lss.emc.com:8020 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused
```
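The `Connection refused` from `sclq174.lss.emc.com:8020` means nothing accepted the connection on the NameNode RPC port. A quick way to narrow this down from a slave, sketched under the assumption that a standard Hadoop client is installed (the host, port, and path are the ones from the log):

```
# What filesystem URI is the Hadoop client actually configured with?
hdfs getconf -confKey fs.defaultFS

# Talk to the same RPC endpoint the Mesos fetcher would use;
# this should reproduce the "Connection refused".
hadoop fs -ls hdfs://sclq174.lss.emc.com:8020/spark/

# On the NameNode host: is anything listening on 8020 at all?
netstat -tln | grep 8020
```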

HDFS path with port 50070 - `hdfs://ipaddress:50070/spark/spark-1.2.1.tgz`:
```
I0427 13:34:25.295554 16633 fetcher.cpp:214] Fetching URI 'hdfs://ipaddress:50070/spark/spark-1.2.1.tgz'
E0427 13:34:28.438596 16633 fetcher.cpp:113] HDFS copyToLocal failed: /usr/lib/hadoop/bin/hadoop fs -copyToLocal 'hdfs://ipaddress:50070/spark/spark-1.2.1.tgz' '/tmp/mesos/slaves/20150427-054938-2933394698-5050-1030-S0/frameworks/20150427-054938-2933394698-5050-1030-0008/executors/20150427-054938-2933394698-5050-1030-S0/runs/2fc7886a-cfff-4cb2-b2f6-25988ca0f8e3/spark-1.2.1.tgz'
copyToLocal: Failed on local exception: com.google.protobuf.InvalidProtocolBufferException: Protocol message end-group tag did not match expected tag.; Host Details : local host is:
```
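One reading of this protobuf error (my interpretation, not from the post): 50070 is the NameNode's default web UI (HTTP) port, while `hdfs://` URIs must point at the RPC port, 8020 by default, and a mismatched end-group tag is what the Hadoop RPC client typically reports when it is pointed at an HTTP endpoint. A sketch of how to tell the two ports apart, keeping the redacted `ipaddress` placeholder from the log:

```
# If this returns HTML, 50070 is the web UI, not an RPC endpoint.
curl -s http://ipaddress:50070/ | head -n 5

# The RPC port (8020 by default) is the one hdfs:// URIs should use.
hadoop fs -ls hdfs://ipaddress:8020/spark/
```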

Any idea why this isn't working?

kb5ga3dv #1

Spark supports different ways of fetching binaries:

- `file:` - absolute paths and `file:/` URIs are served by the driver's HTTP file server; every executor pulls the file from that server.
- `hdfs:`, `http:`, `https:`, `ftp:` - these pull down files and JARs from the URI as expected.
- `local:` - a URI starting with `local:/` is expected to already exist as a local file on each worker node.

`file://home/labadmin/spark-1.2.1.tgz` is not accessible from the driver; you probably want a `local:/` URI instead.

As for the HDFS attempts: there is probably no HDFS server running at `sclq174.lss.emc.com:8020`, and Hadoop fails to resolve that URI format; replacing the hostname with the actual IP address should make it work, e.g. `192.168.1.1:50070`.
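Putting the two suggestions into concrete form, a sketch using the paths and hosts from the question (`slave-host` is a placeholder of mine, and the 8020 RPC port is my assumption: 8020 is the usual HDFS RPC default, whereas 50070 is normally the web UI port):

```
# Option 1: pre-copy the tarball to the same path on every slave,
# then use a local: URI so no fetching happens at all.
scp /home/labadmin/spark-1.2.1.tgz labadmin@slave-host:/home/labadmin/
export SPARK_EXECUTOR_URI=local:/home/labadmin/spark-1.2.1.tgz

# Option 2: keep it in HDFS, addressed by the actual IP and the RPC port.
export SPARK_EXECUTOR_URI=hdfs://192.168.1.1:8020/spark/spark-1.2.1.tgz
```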
