Cassandra: Exception in thread "main" — Cannot retrieve files with 'spark' scheme without an active SparkEnv

neskvpey, posted 2022-11-05 in Cassandra

I'm new to Spark and Cassandra. I got a sample from GitHub and am trying to run the application from the link below:
spark-on-cassandra-quickstart
After building the jar file, I tried to execute it with the following command:

C:\Users\user\Desktop\softwares\spark-2.4.3-bin-hadoop2.7\spark-2.4.3-bin-hadoop2.7\bin>spark-submit --class com.github.boneill42.JavaDemo --master spark://localhost:7077
C:\Users\user\git\spark-on-cassandra-quickstart\target/spark-on-cassandra-0.0.1-SNAPSHOT-jar-with-dependencies.jar spark://localhost:7077 localhost

Here is the error I'm running into:

19/06/08 22:59:49 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Exception in thread "main" java.lang.IllegalStateException: Cannot retrieve files with 'spark' scheme without an active SparkEnv.
        at org.apache.spark.util.Utils$.doFetchFile(Utils.scala:690)
        at org.apache.spark.deploy.DependencyUtils$.downloadFile(DependencyUtils.scala:137)
        at org.apache.spark.deploy.SparkSubmit$$anonfun$prepareSubmitEnvironment$7.apply(SparkSubmit.scala:367)
        at org.apache.spark.deploy.SparkSubmit$$anonfun$prepareSubmitEnvironment$7.apply(SparkSubmit.scala:367)
        at scala.Option.map(Option.scala:146)
        at org.apache.spark.deploy.SparkSubmit.prepareSubmitEnvironment(SparkSubmit.scala:366)
        at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:143)
        at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
        at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:924)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:933)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

Please help me resolve this issue.


7cwmlq89 1#

In your case, it looks like you want to start in standalone mode:

spark://HOST:PORT   Connect to the given Spark standalone cluster master.
The port must be whichever one your master is configured to use, which is 7077 by default.
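
That spark:// URL must point at a master that is actually running. A quick sanity check I'd add (not from the original answer; it assumes a local master with the default web UI port 8080) is to scrape the UI page, which displays the exact spark://HOST:PORT value to pass to --master:

# sketch: assumes a local master and the default web UI port 8080
curl -s http://localhost:8080 | grep -o 'spark://[^"< ]*'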

Did you start the Spark master and worker first?
Launch the master:

./sbin/start-master.sh
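
Your spark-submit prompt shows a Windows installation, and the .sh scripts need a Unix-like shell. A sketch of the equivalent on plain Windows cmd is to launch the master class directly through the spark-class launcher (the install path below is copied from your prompt):

REM sketch for Windows cmd; Master is the standalone master's entry-point class
cd C:\Users\user\Desktop\softwares\spark-2.4.3-bin-hadoop2.7\spark-2.4.3-bin-hadoop2.7
bin\spark-class org.apache.spark.deploy.master.Master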

Launch the worker:

./bin/spark-class org.apache.spark.deploy.worker.Worker spark://localhost:7077 -c 1 -m 512M
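
On Windows, the same spark-class approach should work for the worker (a sketch mirroring the Unix command above; -c caps the worker at 1 core and -m at 512 MB of memory):

REM sketch for Windows cmd, run from the Spark install directory
bin\spark-class org.apache.spark.deploy.worker.Worker spark://localhost:7077 -c 1 -m 512M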

Once the master and worker processes are up, you can submit your job again.
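
With both running, your original command (reassembled on one line, with the same paths you posted) should no longer hit the SparkEnv error:

spark-submit --class com.github.boneill42.JavaDemo --master spark://localhost:7077 C:\Users\user\git\spark-on-cassandra-quickstart\target/spark-on-cassandra-0.0.1-SNAPSHOT-jar-with-dependencies.jar spark://localhost:7077 localhost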
