Error: SparkContext is shutting down

dl5txlt9  posted on 2021-05-27  in  Hadoop

When running a Spark job in Zeppelin, I get the following error. Is there a specific cause for this error?

Py4JJavaError: An error occurred while calling o523.showString.
: org.apache.spark.SparkException: Job 28 cancelled because SparkContext was shut down
    at org.apache.spark.scheduler.DAGScheduler$$anonfun$cleanUpAfterSchedulerStop$1.apply(DAGScheduler.scala:837)
    at org.apache.spark.scheduler.DAGScheduler$$anonfun$cleanUpAfterSchedulerStop$1.apply(DAGScheduler.scala:835)
    at scala.collection.mutable.HashSet.foreach(HashSet.scala:78)
    at org.apache.spark.scheduler.DAGScheduler.cleanUpAfterSchedulerStop(DAGScheduler.scala:835)
    at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onStop(DAGScheduler.scala:1890)
    at org.apache.spark.util.EventLoop.stop(EventLoop.scala:83)
    at org.apache.spark.scheduler.DAGScheduler.stop(DAGScheduler.scala:1803)
    at org.apache.spark.SparkContext$$anonfun$stop$8.apply$mcV$sp(SparkContext.scala:1931)
    at org.apache.spark.util.Utils$.tryLogNonFatalError(Utils.scala:1361)
    at org.apache.spark.SparkContext.stop(SparkContext.scala:1930)
...

Thanks, nishank
