spark-2.3 submit not working on Windows 10 64-bit with a one-master, one-worker configuration

Asked by 3hvapo4f on 2021-05-31 in Hadoop

Hadoop HDFS and Spark setup:
1) Environment variables:
HADOOP_CONF_DIR = f:\spark\hadoop2\hadoop-2.7.6\etc\hadoop
HADOOP_HOME = f:\spark\hadoop2\hadoop-2.7.6
JAVA_HOME = f:\java\jdk1.8.0_121
SPARK_HOME = f:\spark\spark-2.3.0-bin-hadoop2.7
PATH entries:
f:\spark\hadoop2\hadoop-2.7.6\bin
f:\spark\spark-2.3.0-bin-hadoop2.7\bin
f:\spark\hadoop2\hadoop-2.7.6\lib\native
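For reference, a sketch of setting these variables from a Windows command prompt, using the paths listed above (setx writes to the user environment, so the values only take effect in newly opened prompts):

setx HADOOP_CONF_DIR "f:\spark\hadoop2\hadoop-2.7.6\etc\hadoop"
setx HADOOP_HOME "f:\spark\hadoop2\hadoop-2.7.6"
setx JAVA_HOME "f:\java\jdk1.8.0_121"
setx SPARK_HOME "f:\spark\spark-2.3.0-bin-hadoop2.7"
rem caveat: %PATH% expands to the current combined PATH, which setx then copies into the user PATH
setx PATH "%PATH%;f:\spark\hadoop2\hadoop-2.7.6\bin;f:\spark\spark-2.3.0-bin-hadoop2.7\bin;f:\spark\hadoop2\hadoop-2.7.6\lib\native"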
2) core-site.xml

3) hdfs-site.xml

4) mapred-site.xml
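The XML bodies of these three files did not survive in the post. For orientation only, a minimal core-site.xml for a single-node HDFS usually contains just fs.defaultFS; the hdfs://localhost:9000 value below is an assumption for illustration, not taken from the original post:

<configuration>
  <property>
    <!-- assumed value; the poster's actual setting is not shown -->
    <name>fs.defaultFS</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>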
Command to start the master:
spark-class org.apache.spark.deploy.master.Master
Command to start a worker:
spark-class org.apache.spark.deploy.worker.Worker spark://192.168.0.12:7077
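On Windows these launchers are normally invoked as spark-class.cmd from %SPARK_HOME%\bin, each in its own prompt; a sketch assuming the SPARK_HOME above:

rem prompt 1: start the standalone master (its web UI defaults to port 8080)
cd /d %SPARK_HOME%
bin\spark-class.cmd org.apache.spark.deploy.master.Master

rem prompt 2: start a worker and register it with the master
cd /d %SPARK_HOME%
bin\spark-class.cmd org.apache.spark.deploy.worker.Worker spark://192.168.0.12:7077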
Command to submit the job:
spark-submit --class org.apache.spark.examples.SparkPi --master spark://192.168.0.12:7077 f:/spark/spark-2.3.0-bin-hadoop2.7/examples/jars/spark-examples_2.11-2.3.0.jar
Error log:
Exception in thread "main" java.io.IOException: No FileSystem for scheme: f
    at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:2660)
    at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2667)
    at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:94)
    at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2703)
    at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2685)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:373)
    at org.apache.spark.util.Utils$.getHadoopFileSystem(Utils.scala:1893)
    at org.apache.spark.util.Utils$.doFetchFile(Utils.scala:690)
    at org.apache.spark.deploy.DependencyUtils$.downloadFile(DependencyUtils.scala:131)
    at org.apache.spark.deploy.SparkSubmit$$anonfun$prepareSubmitEnvironment$7.apply(SparkSubmit.scala:401)
    at org.apache.spark.deploy.SparkSubmit$$anonfun$prepareSubmitEnvironment$7.apply(SparkSubmit.scala:401)
    at scala.Option.map(Option.scala:146)
    at org.apache.spark.deploy.SparkSubmit$.prepareSubmitEnvironment(SparkSubmit.scala:400)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:170)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:136)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
I have not been able to resolve this, even after trying several blog posts.
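For context, "No FileSystem for scheme: f" means Hadoop parsed the leading f: drive letter of the jar path as a URI scheme. A sketch of the same submission with an explicit file:/// URI, the usual way to disambiguate local Windows paths (an illustration, not a verified fix for this setup):

spark-submit --class org.apache.spark.examples.SparkPi --master spark://192.168.0.12:7077 file:///f:/spark/spark-2.3.0-bin-hadoop2.7/examples/jars/spark-examples_2.11-2.3.0.jar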

No answers yet.
