Docker Spark container fails to start

bxgwgixi · published 2023-05-06 in Docker
Follow (0) | Answers (1) | Views (133)

I am new to Spark and containers. After pulling the apache/spark image from Docker Hub, I tried to run a Spark container with Docker on Windows 11, but I get the error below when starting the container, and I do not know how or where to begin troubleshooting. Any help would be greatly appreciated.
Here is the log:

2023-03-28 11:47:04 ++ id -u
2023-03-28 11:47:04 + myuid=185
2023-03-28 11:47:04 ++ id -g
2023-03-28 11:47:04 + mygid=0
2023-03-28 11:47:04 + set +e
2023-03-28 11:47:04 ++ getent passwd 185
2023-03-28 11:47:04 + uidentry=
2023-03-28 11:47:04 + set -e
2023-03-28 11:47:04 + '[' -z '' ']'
2023-03-28 11:47:04 + '[' -w /etc/passwd ']'
2023-03-28 11:47:04 + echo '185:x:185:0:anonymous uid:/opt/spark:/bin/false'
2023-03-28 11:47:04 + '[' -z /usr/local/openjdk-11 ']'
2023-03-28 11:47:04 + SPARK_CLASSPATH=':/opt/spark/jars/*'
2023-03-28 11:47:04 + env
2023-03-28 11:47:04 + grep SPARK_JAVA_OPT_
2023-03-28 11:47:04 + sort -t_ -k4 -n
2023-03-28 11:47:04 + sed 's/[^=]*=\(.*\)/\1/g'
2023-03-28 11:47:04 + readarray -t SPARK_EXECUTOR_JAVA_OPTS
2023-03-28 11:47:04 + '[' -n '' ']'
2023-03-28 11:47:04 + '[' -z ']'
2023-03-28 11:47:04 + '[' -z ']'
2023-03-28 11:47:04 + '[' -n '' ']'
2023-03-28 11:47:04 + '[' -z ']'
2023-03-28 11:47:04 + '[' -z ']'
2023-03-28 11:47:04 + '[' -z x ']'
2023-03-28 11:47:04 + SPARK_CLASSPATH='/opt/spark/conf::/opt/spark/jars/*'
2023-03-28 11:47:04 + case "$1" in
2023-03-28 11:47:04 + echo 'Non-spark-on-k8s command provided, proceeding in pass-through mode...'
2023-03-28 11:47:04 + CMD=("$@")
2023-03-28 11:47:04 + exec /usr/bin/tini -s --
2023-03-28 11:47:04 tini (tini version 0.19.0)
2023-03-28 11:47:04 Usage: tini [OPTIONS] PROGRAM -- [ARGS] | --version
2023-03-28 11:47:04 
2023-03-28 11:47:04 Execute a program under the supervision of a valid init process (tini)
2023-03-28 11:47:04 
2023-03-28 11:47:04 Command line options:
2023-03-28 11:47:04 
2023-03-28 11:47:04   --version: Show version and exit.
2023-03-28 11:47:04   -h: Show this help message and exit.
2023-03-28 11:47:04   -s: Register as a process subreaper (requires Linux >= 3.4).
2023-03-28 11:47:04   -p SIGNAL: Trigger SIGNAL when parent dies, e.g. "-p SIGKILL".
2023-03-28 11:47:04   -v: Generate more verbose output. Repeat up to 3 times.
2023-03-28 11:47:04   -w: Print a warning when processes are getting reaped.
2023-03-28 11:47:04   -g: Send signals to the child's process group.
2023-03-28 11:47:04   -e EXIT_CODE: Remap EXIT_CODE (from 0 to 255) to 0.
2023-03-28 11:47:04   -l: Show license and exit.
2023-03-28 11:47:04 
2023-03-28 11:47:04 Environment variables:
2023-03-28 11:47:04 
2023-03-28 11:47:04   TINI_SUBREAPER: Register as a process subreaper (requires Linux >= 3.4).
2023-03-28 11:47:04   TINI_VERBOSITY: Set the verbosity level (default: 1).
2023-03-28 11:47:04   TINI_KILL_PROCESS_GROUP: Send signals to the child's process group.
2023-03-28 11:47:04 
2023-03-28 11:47:04 Non-spark-on-k8s command provided, proceeding in pass-through mode...

Best regards

bihw5rsg #1

I ran into the same problem and found the solution. Your docker run command is missing a command for the container to execute; you are probably running something like: docker run spark-image-name. That is why tini prints its usage text in your log: the image's entrypoint falls through to pass-through mode and executes tini without a program to supervise.
If you want to run a Spark application, you should pass the command and its arguments to docker run, after the image name.
For example, to submit a Spark application named myApp.py, you could use the following command:

docker run apache/spark:3.4.0 /opt/spark/bin/spark-submit myApp.py
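Note that myApp.py has to exist inside the container for this to work. A minimal sketch of one way to achieve that, assuming your script sits in the current host directory (the bind mount, the /opt/spark/work-dir target, and the --master 'local[*]' option are illustrative additions, not part of the original answer):

docker run -v "$(pwd)":/opt/spark/work-dir apache/spark:3.4.0 /opt/spark/bin/spark-submit --master 'local[*]' /opt/spark/work-dir/myApp.py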

If you are trying to run a command other than spark-submit, you can also pass it as an argument to docker run. For example:

docker run apache/spark:3.4.0 ls -al /opt/spark

docker run -it apache/spark:3.4.0 /bin/bash

These run the given command in the container instead of launching a Spark application. (For an interactive shell, note the -it flags in the second command.)
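To sanity-check the image without writing any application code, you can also submit one of the examples that ship with Spark. A sketch assuming the standard layout of the 3.4.0 image (the examples jar, built against Scala 2.12 by default, may be named differently in other builds):

docker run apache/spark:3.4.0 /opt/spark/bin/spark-submit --class org.apache.spark.examples.SparkPi --master 'local[*]' /opt/spark/examples/jars/spark-examples_2.12-3.4.0.jar 100

If this prints a line like "Pi is roughly 3.14", the image itself is fine and the problem is only in how the container is being started.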
