Unable to write Spark logs to HDFS

guicsvcw · posted 2021-05-29 in Hadoop

I am trying to write a log file to HDFS from my Spark job running in cluster mode. I am using a log4j properties file, but no log is ever generated.


# Root logger option

log4j.rootLogger=INFO, file

# Direct log messages to a rolling log file on HDFS

log4j.appender.file=org.apache.log4j.rolling.RollingFileAppender
log4j.appender.file.File=hdfs://devnameservice/acceptance/cls/data/DateAdjust1.log
log4j.appender.file.rollingPolicy=org.apache.log4j.rolling.TimeBasedRollingPolicy

log4j.appender.file.triggeringPolicy=org.apache.log4j.rolling.SizeBasedTriggeringPolicy
log4j.appender.file.triggeringPolicy.maxFileSize=10000
log4j.appender.file.layout=org.apache.log4j.PatternLayout
log4j.appender.file.layout.ConversionPattern="eventTime":"%d{yyyy-MM-dd HH:mm:ss[z]Z}","level":"%p","correlationUUID":"%X{UUID}","ComponentName":"%c","MethodName":"%M","applicationId":"%X{applicationId}","System-IPv4":"%X{ipAddress}","System-port":"%X{systemPort}","User":"%X{user}","Action":"%X{action}","Status":"%X{status}","Reason":"%X{reason}","Message":"%m - Counter:%X{Counter} - Pipeline:%X{pipeLine} - Stage:%X{stageType}|%X{stageName} - Step:%X{stepType}|%X{stepName}"%n
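For context on how a file like this normally gets used: in cluster mode the driver and executor JVMs do not read a log4j.properties from the submitting machine automatically; it has to be shipped with the job and pointed at explicitly. A minimal spark-submit sketch, assuming the file above is saved as log4j.properties in the working directory (the main class com.example.DateAdjust and jar name dateadjust.jar are hypothetical placeholders):

```shell
# Ship the log4j config to every YARN container, then tell both the
# driver and the executor JVMs to load it instead of Spark's default.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --files log4j.properties \
  --conf "spark.driver.extraJavaOptions=-Dlog4j.configuration=log4j.properties" \
  --conf "spark.executor.extraJavaOptions=-Dlog4j.configuration=log4j.properties" \
  --class com.example.DateAdjust \
  dateadjust.jar
```

Note also that org.apache.log4j.rolling.RollingFileAppender is not part of core log4j 1.x; it ships in the separate apache-log4j-extras jar, which must be on the job's classpath or the appender silently fails to load.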

No answers yet.