spark-submit not using YARN

42fyovps · posted 2021-06-02 in Hadoop | Answers (2) | Views (380)

I have set up a 5-node Hadoop cluster with YARN, and Spark is installed on all 5 nodes. I am using spark-1.5.0-cdh5.5.0.
When I run spark-shell --master yarn --num-executors 3, the shell starts as expected and gets its resources from the ResourceManager through YARN. So I assume Spark is picking up the Hadoop conf files as expected. But when I run

spark-submit word_count.py --master yarn-cluster --num-executors 3

it tries to connect to the Spark standalone master, which I believe should not be needed when running on YARN. It fails with the following error:

16/11/08 00:18:31 INFO util.Utils: Successfully started service 'HTTP file server' on port 47990.
16/11/08 00:18:31 INFO spark.SparkEnv: Registering OutputCommitCoordinator
16/11/08 00:18:41 INFO server.Server: jetty-8.y.z-SNAPSHOT
16/11/08 00:18:41 INFO server.AbstractConnector: Started SelectChannelConnector@0.0.0.0:4040
16/11/08 00:18:41 INFO util.Utils: Successfully started service 'SparkUI' on port 4040.
16/11/08 00:18:41 INFO ui.SparkUI: Started SparkUI at http://10.0.0.4:4040
16/11/08 00:18:41 INFO util.Utils: Copying /home/rshaik26/word_count.py to /tmp/spark-0a5348f8-5ba8-4906-89af-7499054b554e/userFiles-287b5d13-123a-4bd6-9fe3-489af2a502a1/word_count.py
16/11/08 00:18:41 INFO spark.SparkContext: Added file file:/home/rshaik26/word_count.py at http://10.0.0.4:47990/files/word_count.py with timestamp 1478544521986
16/11/08 00:18:42 WARN metrics.MetricsSystem: Using default name DAGScheduler for source because spark.app.id is not set.
16/11/08 00:18:42 INFO client.AppClient$ClientEndpoint: Connecting to master spark://ubuntuhdp2:7077...
16/11/08 00:18:42 WARN client.AppClient$ClientEndpoint: Could not connect to ubuntuhdp2:7077: akka.remote.EndpointAssociationException: Association failed with [akka.tcp://sparkMaster@ubuntuhdp2:7077]
16/11/08 00:18:42 ERROR akka.ErrorMonitor: AssociationError [akka.tcp://sparkDriver@10.0.0.4:53411] -> [akka.tcp://sparkMaster@ubuntuhdp2:7077]: Error [Association failed with [akka.tcp://sparkMaster@ubuntuhdp2:7077]] [
akka.remote.EndpointAssociationException: Association failed with [akka.tcp://sparkMaster@ubuntuhdp2:7077]
Caused by: akka.remote.transport.netty.NettyTransport$$anonfun$associate$1$$anon$2: Connection refused: ubuntuhdp2/10.0.0.4:7077
]
akka.event.Logging$Error$NoCause$
16/11/08 00:18:42 WARN client.AppClient$ClientEndpoint: Could not connect to ubuntuhdp2:7077: akka.remote.EndpointAssociationException: Association failed with [akka.tcp://sparkMaster@ubuntuhdp2:7077]
16/11/08 00:18:42 ERROR akka.ErrorMonitor: AssociationError [akka.tcp://sparkDriver@10.0.0.4:53411] -> [akka.tcp://sparkMaster@ubuntuhdp2:7077]: Error [Association failed with [akka.tcp://sparkMaster@ubuntuhdp2:7077]] [
akka.remote.EndpointAssociationException: Association failed with [akka.tcp://sparkMaster@ubuntuhdp2:7077]
Caused by: akka.remote.transport.netty.NettyTransport$$anonfun$associate$1$$anon$2: Connection refused: ubuntuhdp2/10.0.0.4:7077
]
akka.event.Logging$Error$NoCause$
16/11/08 00:18:42 WARN client.AppClient$ClientEndpoint: Could not connect to ubuntuhdp2:7077: akka.remote.EndpointAssociationException: Association failed with [akka.tcp://sparkMaster@ubuntuhdp2:7077]
16/11/08 00:18:42 ERROR akka.ErrorMonitor: AssociationError [akka.tcp://sparkDriver@10.0.0.4:53411] -> [akka.tcp://sparkMaster@ubuntuhdp2:7077]: Error [Association failed with [akka.tcp://sparkMaster@ubuntuhdp2:7077]] [
akka.remote.EndpointAssociationException: Association failed with [akka.tcp://sparkMaster@ubuntuhdp2:7077]
Caused by: akka.remote.transport.netty.NettyTransport$$anonfun$associate$1$$anon$2: Connection refused: ubuntuhdp2/10.0.0.4:7077
]
akka.event.Logging$Error$NoCause$
16/11/08 00:18:42 WARN client.AppClient$ClientEndpoint: Could not connect to ubuntuhdp2:7077: akka.remote.EndpointAssociationException: Association failed with [akka.tcp://sparkMaster@ubuntuhdp2:7077]
16/11/08 00:18:42 ERROR akka.ErrorMonitor: AssociationError [akka.tcp://sparkDriver@10.0.0.4:53411] -> [akka.tcp://sparkMaster@ubuntuhdp2:7077]: Error [Association failed with [akka.tcp://sparkMaster@ubuntuhdp2:7077]] [
akka.remote.EndpointAssociationException: Association failed with [akka.tcp://sparkMaster@ubuntuhdp2:7077]
Caused by: akka.remote.transport.netty.NettyTransport$$anonfun$associate$1$$anon$2: Connection refused: ubuntuhdp2/10.0.0.4:7077
]
akka.event.Logging$Error$NoCause$
16/11/08 00:18:42 WARN client.AppClient$ClientEndpoint: Failed to connect to master ubuntuhdp2:7077
akka.actor.ActorNotFound: Actor not found for: ActorSelection[Actor[akka.tcp://sparkMaster@ubuntuhdp2:7077/]/user/Master]
    at akka.actor.ActorSelection$$anonfun$resolveOne$1.apply(ActorSelection.scala:66)
    at akka.actor.ActorSelection$$anonfun$resolveOne$1.apply(ActorSelection.scala:64)
    at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:32)
    at akka.dispatch.BatchingExecutor$Batch$$anonfun$run$1.processBatch$1(BatchingExecutor.scala:67)
    at akka.dispatch.BatchingExecutor$Batch$$anonfun$run$1.apply$mcV$sp(BatchingExecutor.scala:82)
    at akka.dispatch.BatchingExecutor$Batch$$anonfun$run$1.apply(BatchingExecutor.scala:59)
    at akka.dispatch.BatchingExecutor$Batch$$anonfun$run$1.apply(BatchingExecutor.scala:59)
    at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:72)
    at akka.dispatch.BatchingExecutor$Batch.run(BatchingExecutor.scala:58)
    at akka.dispatch.ExecutionContexts$sameThreadExecutionContext$.unbatchedExecute(Future.scala:74)
    at akka.dispatch.BatchingExecutor$class.execute(BatchingExecutor.scala:110)
    at akka.dispatch.ExecutionContexts$sameThreadExecutionContext$.execute(Future.scala:73)
    at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:40)
    at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:248)
    at akka.pattern.PromiseActorRef.$bang(AskSupport.scala:269)
    at akka.actor.EmptyLocalActorRef.specialHandle(ActorRef.scala:512)
    at akka.actor.DeadLetterActorRef.specialHandle(ActorRef.scala:545)
    at akka.actor.DeadLetterActorRef.$bang(ActorRef.scala:535)
    at akka.remote.RemoteActorRefProvider$RemoteDeadLetterActorRef.$bang(RemoteActorRefProvider.scala:91)
    at akka.actor.ActorRef.tell(ActorRef.scala:125)
    at akka.dispatch.Mailboxes$$anon$1$$anon$2.enqueue(Mailboxes.scala:44)
    at akka.dispatch.QueueBasedMessageQueue$class.cleanUp(Mailbox.scala:438)
    at akka.dispatch.UnboundedDequeBasedMailbox$MessageQueue.cleanUp(Mailbox.scala:650)
    at akka.dispatch.Mailbox.cleanUp(Mailbox.scala:309)
    at akka.dispatch.MessageDispatcher.unregister(AbstractDispatcher.scala:204)
    at akka.dispatch.MessageDispatcher.detach(AbstractDispatcher.scala:140)
    at akka.actor.dungeon.FaultHandling$class.akka$actor$dungeon$FaultHandling$$finishTerminate(FaultHandling.scala:203)
    at akka.actor.dungeon.FaultHandling$class.terminate(FaultHandling.scala:163)
    at akka.actor.ActorCell.terminate(ActorCell.scala:338)
    at akka.actor.ActorCell.invokeAll$1(ActorCell.scala:431)
    at akka.actor.ActorCell.systemInvoke(ActorCell.scala:447)
    at akka.dispatch.Mailbox.processAllSystemMessages(Mailbox.scala:262)
    at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:240)
    at akka.dispatch.Mailbox.run(Mailbox.scala:219)
    at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:386)
    at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
    at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
    at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
    at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
16/11/08 00:19:02 INFO client.AppClient$ClientEndpoint: Connecting to master spark://ubuntuhdp2:7077...
16/11/08 00:19:02 WARN client.AppClient$ClientEndpoint: Could not connect to ubuntuhdp2:7077: akka.remote.EndpointAssociationException: Association failed with [akka.tcp://sparkMaster@ubuntuhdp2:7077]
16/11/08 00:19:02 ERROR akka.ErrorMonitor: AssociationError [akka.tcp://sparkDriver@10.0.0.4:53411] -> [akka.tcp://sparkMaster@ubuntuhdp2:7077]: Error [Association failed with [akka.tcp://sparkMaster@ubuntuhdp2:7077]] [
akka.remote.EndpointAssociationException: Association failed with [akka.tcp://sparkMaster@ubuntuhdp2:7077]
Caused by: akka.remote.transport.netty.NettyTransport$$anonfun$associate$1$$anon$2: Connection refused: ubuntuhdp2/10.0.0.4:7077
]
akka.event.Logging$Error$NoCause$
16/11/08 00:19:02 WARN client.AppClient$ClientEndpoint: Could not connect to ubuntuhdp2:7077: akka.remote.EndpointAssociationException: Association failed with [akka.tcp://sparkMaster@ubuntuhdp2:7077]
16/11/08 00:19:02 ERROR akka.ErrorMonitor: AssociationError [akka.tcp://sparkDriver@10.0.0.4:53411] -> [akka.tcp://sparkMaster@ubuntuhdp2:7077]: Error [Association failed with [akka.tcp://sparkMaster@ubuntuhdp2:7077]] [
akka.remote.EndpointAssociationException: Association failed with [akka.tcp://sparkMaster@ubuntuhdp2:7077]
Caused by: akka.remote.transport.netty.NettyTransport$$anonfun$associate$1$$anon$2: Connection refused: ubuntuhdp2/10.0.0.4:7077
]
akka.event.Logging$Error$NoCause$
16/11/08 00:19:02 WARN client.AppClient$ClientEndpoint: Could not connect to ubuntuhdp2:7077: akka.remote.EndpointAssociationException: Association failed with [akka.tcp://sparkMaster@ubuntuhdp2:7077]
16/11/08 00:19:02 ERROR akka.ErrorMonitor: AssociationError [akka.tcp://sparkDriver@10.0.0.4:53411] -> [akka.tcp://sparkMaster@ubuntuhdp2:7077]: Error [Association failed with [akka.tcp://sparkMaster@ubuntuhdp2:7077]] [
akka.remote.EndpointAssociationException: Association failed with [akka.tcp://sparkMaster@ubuntuhdp2:7077]
Caused by: akka.remote.transport.netty.NettyTransport$$anonfun$associate$1$$anon$2: Connection refused: ubuntuhdp2/10.0.0.4:7077
]
akka.event.Logging$Error$NoCause$
16/11/08 00:19:02 WARN client.AppClient$ClientEndpoint: Could not connect to ubuntuhdp2:7077: akka.remote.EndpointAssociationException: Association failed with [akka.tcp://sparkMaster@ubuntuhdp2:7077]
16/11/08 00:19:02 ERROR akka.ErrorMonitor: AssociationError [akka.tcp://sparkDriver@10.0.0.4:53411] -> [akka.tcp://sparkMaster@ubuntuhdp2:7077]: Error [Association failed with [akka.tcp://sparkMaster@ubuntuhdp2:7077]] [
akka.remote.EndpointAssociationException: Association failed with [akka.tcp://sparkMaster@ubuntuhdp2:7077]
Caused by: akka.remote.transport.netty.NettyTransport$$anonfun$associate$1$$anon$2: Connection refused: ubuntuhdp2/10.0.0.4:7077
]
akka.event.Logging$Error$NoCause$
16/11/08 00:19:02 WARN client.AppClient$ClientEndpoint: Failed to connect to master ubuntuhdp2:7077
akka.actor.ActorNotFound: Actor not found for: ActorSelection[Actor[akka.tcp://sparkMaster@ubuntuhdp2:7077/]/user/Master]
    at akka.actor.ActorSelection$$anonfun$resolveOne$1.apply(ActorSelection.scala:66)
    at akka.actor.ActorSelection$$anonfun$resolveOne$1.apply(ActorSelection.scala:64)
    at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:32)
    at akka.dispatch.BatchingExecutor$Batch$$anonfun$run$1.processBatch$1(BatchingExecutor.scala:67)
    at akka.dispatch.BatchingExecutor$Batch$$anonfun$run$1.apply$mcV$sp(BatchingExecutor.scala:82)
    at akka.dispatch.BatchingExecutor$Batch$$anonfun$run$1.apply(BatchingExecutor.scala:59)
    at akka.dispatch.BatchingExecutor$Batch$$anonfun$run$1.apply(BatchingExecutor.scala:59)
    at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:72)
    at akka.dispatch.BatchingExecutor$Batch.run(BatchingExecutor.scala:58)
    at akka.dispatch.ExecutionContexts$sameThreadExecutionContext$.unbatchedExecute(Future.scala:74)
    at akka.dispatch.BatchingExecutor$class.execute(BatchingExecutor.scala:110)
    at akka.dispatch.ExecutionContexts$sameThreadExecutionContext$.execute(Future.scala:73)
    at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:40)
    at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:248)
    at akka.pattern.PromiseActorRef.$bang(AskSupport.scala:269)
    at akka.actor.EmptyLocalActorRef.specialHandle(ActorRef.scala:512)
    at akka.actor.DeadLetterActorRef.specialHandle(ActorRef.scala:545)
    at akka.actor.DeadLetterActorRef.$bang(ActorRef.scala:535)
    at akka.remote.RemoteActorRefProvider$RemoteDeadLetterActorRef.$bang(RemoteActorRefProvider.scala:91)
    at akka.actor.ActorRef.tell(ActorRef.scala:125)
    at akka.dispatch.Mailboxes$$anon$1$$anon$2.enqueue(Mailboxes.scala:44)
    at akka.dispatch.QueueBasedMessageQueue$class.cleanUp(Mailbox.scala:438)
    at akka.dispatch.UnboundedDequeBasedMailbox$MessageQueue.cleanUp(Mailbox.scala:650)
    at akka.dispatch.Mailbox.cleanUp(Mailbox.scala:309)
    at akka.dispatch.MessageDispatcher.unregister(AbstractDispatcher.scala:204)
    at akka.dispatch.MessageDispatcher.detach(AbstractDispatcher.scala:140)
    at akka.actor.dungeon.FaultHandling$class.akka$actor$dungeon$FaultHandling$$finishTerminate(FaultHandling.scala:203)
    at akka.actor.dungeon.FaultHandling$class.terminate(FaultHandling.scala:163)
    at akka.actor.ActorCell.terminate(ActorCell.scala:338)
    at akka.actor.ActorCell.invokeAll$1(ActorCell.scala:431)
    at akka.actor.ActorCell.systemInvoke(ActorCell.scala:447)
    at akka.dispatch.Mailbox.processAllSystemMessages(Mailbox.scala:262)
    at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:240)
    at akka.dispatch.Mailbox.run(Mailbox.scala:219)
    at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:386)
    at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
    at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
    at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
    at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)

Please help me if there is anything wrong with my configuration.
Here is spark-env.sh:

SPARK_JAVA_OPTS=-Dspark.driver.port=53411
HADOOP_CONF_DIR=/usr/lib/hadoop-2.6.0-cdh5.5.0/etc/hadoop/
SPARK_MASTER_IP=ubuntuhdp2
SPARK_DIST_CLASSPATH=$(hadoop classpath):/usr/lib/hadoop-2.6.0-cdh5.5.0/share/hadoop/tools/lib/*

And spark-defaults.conf:

spark.master                     spark://ubuntuhdp2:7077
# spark.eventLog.enabled           true
# spark.eventLog.dir               hdfs://namenode:8021/directory
spark.serializer                 org.apache.spark.serializer.KryoSerializer
# spark.driver.memory              5g
# spark.executor.extraJavaOptions  -XX:+PrintGCDetails -Dkey=value -Dnumbers="one two three"
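
Note that spark.master here is pinned to the standalone URL spark://ubuntuhdp2:7077, which is exactly the master the driver falls back to whenever --master is not parsed from the command line. A minimal sketch of an alternative spark-defaults.conf, assuming you want YARN to be the default master (yarn-cluster is the Spark 1.x-style value; comment the line out instead if you prefer to always pass --master explicitly):

# spark-defaults.conf -- sketch: make YARN the default instead of the standalone master
spark.master                     yarn-cluster
spark.serializer                 org.apache.spark.serializer.KryoSerializer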
ecr0jaav 1#

I tried this on my system. The clue was these log statements:
numExecutors is set to its default value: 2
maxExecutors is set to its default value: 2
Even when I increased the number of executors, I got the same result.
So the fix is simple:

spark-submit --master yarn-cluster --num-executors 3 word_count.py
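
One way to confirm that the job really went through YARN rather than the standalone master is to look for it in the ResourceManager, for example (the application ID below is a placeholder to fill in):

yarn application -list
# after the job finishes, fetch its aggregated logs
yarn logs -applicationId <applicationId>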
b4wnujal 2#

I believe that with:

spark-submit word_count.py --master yarn-cluster --num-executors 3

you were telling spark-submit "I want to launch Spark, submitting word_count.py, and --master yarn-cluster --num-executors 3 are my [application-arguments]", so it picked the default master.
Try the following instead:

spark-submit --master yarn-cluster --num-executors 3 word_count.py

Yes, this should submit the job to Spark on YARN, and because you specified -cluster, your application runs somewhere inside the cluster (on a YARN node).
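
In other words, spark-submit treats everything after the application file as arguments for the application itself, so the spark-submit options must come before it. A sketch of the general ordering (input.txt and output/ are hypothetical arguments that word_count.py would itself have to accept):

# [spark-submit options] come before the script; [application-arguments] come after it
spark-submit --master yarn-cluster --num-executors 3 --executor-memory 1g word_count.py input.txt output/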
