I am running a Spark cluster on Kubernetes with the log4j properties file below. I still don't see the console logs being redirected to /var/log/SparkDriver.log.
I do see the driver/executor logs in that file, but not the application output. For example, the lines below end up in the pod log instead of the file:
kubectl logs -f logtest2-1592207949203-driver -n sparkloads
...
2020-06-15 [info]: starting fluentd-1.9.2 pid=16 ruby="2.5.1"
2020-06-15 [info]: spawn command to main: cmdline=["/opt/td-agent/embedded/bin/ruby", "-Eascii-8bit:ascii-8bit", "/usr/sbin/td-agent", "-d", "/var/run/td-agent/td-agent.pid", "--under-supervisor"]
Pi is roughly 3.1421357106785535
I want to see that last line in the /var/log/SparkDriver.log file.
What am I missing?
log4j.rootCategory=ALL,FILE
log4j.appender.FILE=org.apache.log4j.RollingFileAppender
log4j.appender.FILE.File=/var/log/SparkDriver.log
log4j.appender.FILE.Append=false
log4j.appender.FILE.layout=org.apache.log4j.PatternLayout
log4j.appender.FILE.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n
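For comparison, this is roughly how a console appender could be kept alongside the FILE appender, so that log4j events go to both the pod log and /var/log/SparkDriver.log (a sketch only, not part of my current setup):

# Sketch: route the root logger to both the file and the console (not applied in my setup)
log4j.rootCategory=ALL, FILE, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n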
Adding the spark-submit command:
/opt/spark/bin/spark-submit \
--deploy-mode cluster \
--class org.apache.spark.examples.SparkPi \
--master k8s://https://spark.master.com \
--conf spark.executor.instances=2 \
--conf spark.app.name=logtest \
--conf spark.kubernetes.container.image=myimage \
--conf spark.kubernetes.docker.image.pullPolicy=Always \
--conf spark.kubernetes.namespace=spark \
--conf spark.metrics.conf=/opt/spark/monitoring/metrics.properties \
--conf "spark.driver.extraJavaOptions=-Dlog4j.configuration=file:///opt/spark/monitoring/log4j.properties" \
--conf "spark.executor.extraJavaOptions=-Dlog4j.configuration=file:///opt/spark/monitoring/log4j.properties" \
--driver-java-options "-Dlog4j.configuration=file:///opt/spark/monitoring/log4j.properties" \
local:///opt/spark/examples/jars/spark-examples_2.11-2.4.4.jar
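To inspect the file inside the driver pod, something like the following should work (a sketch; the pod name is taken from the kubectl logs example above and changes with every run):

kubectl exec -n sparkloads logtest2-1592207949203-driver -- tail -n 50 /var/log/SparkDriver.log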