Cannot find 'spark-submit2.cmd'

lp0sw83n · posted 2021-05-18 in Spark
> library('BBmisc')
> library('sparklyr')
> sc <- spark_connect(master = 'local')
Error in start_shell(master = master, spark_home = spark_home, spark_version = version,  : 
  Failed to find 'spark-submit2.cmd' under 'C:\Users\Owner\AppData\Local\spark\spark-3.0.0-bin-hadoop2.7', please verify - SPARK_HOME.
> spark_home_dir()
[1] "C:\\Users\\Owner\\AppData\\Local/spark/spark-3.0.0-bin-hadoop2.7"
> spark_installed_versions()
  spark hadoop                                                              dir
1 3.0.0    2.7 C:\\Users\\Owner\\AppData\\Local/spark/spark-3.0.0-bin-hadoop2.7
> spark_home_set()
Setting SPARK_HOME environment variable to C:\Users\Owner\AppData\Local/spark/spark-3.0.0-bin-hadoop2.7
> sc <- spark_connect(master = 'local')
Error in start_shell(master = master, spark_home = spark_home, spark_version = version,  : 
  Failed to find 'spark-submit2.cmd' under 'C:\Users\Owner\AppData\Local\spark\spark-3.0.0-bin-hadoop2.7', please verify - SPARK_HOME.

Source: https://github.com/englianhu/binary.com-interview-question/issues/1#issue-733943885
How can I resolve the error Failed to find 'spark-submit2.cmd' under 'C:\Users\Owner\AppData\Local\spark\spark-3.0.0-bin-hadoop2.7', please verify - SPARK_HOME?
Reference: Need help getting started with Spark and sparklyr
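
To narrow down the cause, here is a minimal diagnostic sketch (base R plus sparklyr, using the paths reported in the session above): it checks whether the launcher script the error complains about is actually present under SPARK_HOME. A missing bin/spark-submit2.cmd usually means the download or extraction was incomplete, or SPARK_HOME points at the wrong folder.

library(sparklyr)

spark_home <- spark_home_dir()                      # directory sparklyr treats as SPARK_HOME
submit_cmd <- file.path(spark_home, "bin", "spark-submit2.cmd")
file.exists(submit_cmd)                             # FALSE suggests an incomplete or corrupt extraction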

pdkcd3nj 1#

Solved!!!
Steps:
1. Download Spark from https://spark.apache.org/downloads.html
2. Extract the archive to 'C:/Users/scibr/AppData/Local/spark/spark-3.0.1-bin-hadoop3.2'.
3. Manually select the newly installed version: spark_home_set('C:/Users/scibr/AppData/Local/spark/spark-3.0.1-bin-hadoop3.2') (see the sketch below).
GitHub source: https://github.com/englianhu/binary.com-interview-question/issues/1#event-3968919946
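
Putting the steps together, a minimal end-to-end sketch (the install path is the one used in this answer; substitute your own user directory). As an alternative to extracting the archive by hand, spark_install() can download and unpack Spark for you.

library(sparklyr)

# Option A: let sparklyr download and unpack Spark 3.0.1 / Hadoop 3.2 itself
# spark_install(version = "3.0.1", hadoop_version = "3.2")

# Option B: after extracting the archive manually, point SPARK_HOME at it
spark_home_set("C:/Users/scibr/AppData/Local/spark/spark-3.0.1-bin-hadoop3.2")

# Reconnect; bin/spark-submit2.cmd should now be found under SPARK_HOME
sc <- spark_connect(master = "local")
spark_disconnect(sc)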
