Submitting a Java Spark application to YARN from the Eclipse IDE

6yt4nkrj · posted 2021-06-02 in Hadoop

I'm running into a problem when I try to submit a Spark application to YARN from Eclipse. I'm submitting a simple SVM program, but I get the error below. I'm on a MacBook, and I'd be very grateful for a detailed answer.

16/09/17 10:04:19 ERROR SparkContext: Error initializing SparkContext.
java.lang.IllegalStateException: Library directory '.../MyProject/assembly/target/scala-2.11/jars' does not exist; make sure Spark is built.
    at org.apache.spark.launcher.CommandBuilderUtils.checkState(CommandBuilderUtils.java:248)
    at org.apache.spark.launcher.CommandBuilderUtils.findJarsDir(CommandBuilderUtils.java:368)
    at org.apache.spark.launcher.YarnCommandBuilderUtils$.findJarsDir(YarnCommandBuilderUtils.scala:38)
    at org.apache.spark.deploy.yarn.Client.prepareLocalResources(Client.scala:500)
    at org.apache.spark.deploy.yarn.Client.createContainerLaunchContext(Client.scala:834)
    at org.apache.spark.deploy.yarn.Client.submitApplication(Client.scala:167)
    at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:56)
    at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:149)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:500)
    at SVM.main(SVM.java:21)

gkl3eglg (answer 1)


The YARN client cannot find the Spark jars because Eclipse does not know where your Spark installation lives. Open Run Configurations --> Environment in Eclipse and add a SPARK_HOME environment variable pointing at your local Spark distribution (the directory that contains the jars/ folder). See the sketch below for an alternative that avoids relying on SPARK_HOME.
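
If you prefer not to depend on the IDE's environment settings, a minimal sketch (assuming Spark 2.x on the classpath) is to point the YARN client at the Spark jars directly via the spark.yarn.jars property. The class name SubmitToYarn and the HDFS path used here are hypothetical placeholders; adjust them to your cluster.

```java
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public class SubmitToYarn {
    public static void main(String[] args) {
        // The IllegalStateException above is thrown because the YARN client
        // cannot locate the Spark jars. Either set SPARK_HOME in the Eclipse
        // run configuration, or tell the client where the jars live:
        SparkConf conf = new SparkConf()
                .setAppName("SVM on YARN")
                .setMaster("yarn")
                // Hypothetical HDFS location of the Spark jars; globs are allowed.
                .set("spark.yarn.jars", "hdfs:///spark/jars/*.jar");

        JavaSparkContext sc = new JavaSparkContext(conf);
        System.out.println("Spark context started, version " + sc.version());
        sc.stop();
    }
}
```

Uploading the jars to HDFS once and referencing them with spark.yarn.jars also speeds up submission, since the client no longer has to package and ship a local jars directory for every run.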
