Exception when starting a Yarn job due to the Kafka log4j appender

2izufjch  posted on 2021-06-04  in  Kafka

When I run my Yarn job with the log4j.properties below, it fails with the exception shown further down. If I remove KAFKA from the rootLogger, the job starts fine.
This looks like the same issue reported here: https://github.com/wso2/product-ei/issues/2786
But I have not found a solution yet.
Environment: CDH 6.3.3
Here is my log4j.properties file:

log4j.rootLogger=DEBUG,stdout,KAFKA
log4j.appender.stdout=org.apache.log4j.ConsoleAppender
log4j.appender.stdout.Threshold=DEBUG
log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern=%-5p %L %X{taskId} %X{stsId}     %d{yyyy-MM-dd HH:mm:ss}     %c     %t     %m%n  

log4j.appender.alog=org.apache.log4j.RollingFileAppender
log4j.appender.alog.maxFileSize=10MB
log4j.appender.alog.maxBackupIndex=5
log4j.appender.alog.file=../logs/serverx.log
log4j.appender.alog.append=false
log4j.appender.alog.layout=org.apache.log4j.PatternLayout
log4j.appender.alog.layout.conversionPattern=%-5p %X{taskId} %X{stsId}  %d{yyyy-MM-dd HH:mm:ss}     %c     %t     %m%n
log4j.appender.KAFKA=org.apache.kafka.log4jappender.KafkaLog4jAppender
log4j.appender.KAFKA.layout=org.apache.log4j.EnhancedPatternLayout
log4j.appender.KAFKA.layout.conversionPattern=%d%C{1}%t%5p %-4p%X{taskId} %X{stsId} %d{yyyy-MM-dd HH:mm:ss} %c %t %m%n%throwable

log4j.appender.KAFKA.topic=cdhuser_rocplus_roclog
log4j.appender.KAFKA.securityProtocol=PLAINTEXT
log4j.appender.KAFKA.ignoreExceptions=false
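
Note: the KAFKA appender configuration as posted does not show a brokerList, which the KafkaLog4jAppender needs in order to connect to Kafka. For reference, a minimal KAFKA appender block would look like the following sketch (the host and port are placeholders, not from the original post):

log4j.appender.KAFKA=org.apache.kafka.log4jappender.KafkaLog4jAppender
# brokerList is required by the appender; localhost:9092 is a placeholder
log4j.appender.KAFKA.brokerList=localhost:9092
log4j.appender.KAFKA.topic=cdhuser_rocplus_roclog
log4j.appender.KAFKA.securityProtocol=PLAINTEXT
log4j.appender.KAFKA.layout=org.apache.log4j.PatternLayout
log4j.appender.KAFKA.layout.ConversionPattern=%d %-5p %c %m%n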

The exception:

Unexpected problem occured during version sanity check
Reported exception:
java.lang.NullPointerException
    at org.slf4j.LoggerFactory.versionSanityCheck(LoggerFactory.java:267)
    at org.slf4j.LoggerFactory.performInitialization(LoggerFactory.java:126)
    at org.slf4j.LoggerFactory.getILoggerFactory(LoggerFactory.java:412)
    at org.slf4j.LoggerFactory.getLogger(LoggerFactory.java:357)
    at org.slf4j.LoggerFactory.getLogger(LoggerFactory.java:383)
    at org.apache.kafka.clients.CommonClientConfigs.<clinit>(CommonClientConfigs.java:32)
    at org.apache.kafka.clients.producer.ProducerConfig.<clinit>(ProducerConfig.java:333)
    at org.apache.kafka.clients.producer.KafkaProducer.<init>(KafkaProducer.java:327)
    at org.apache.kafka.clients.producer.KafkaProducer.<init>(KafkaProducer.java:299)
    at org.apache.kafka.log4jappender.KafkaLog4jAppender.getKafkaProducer(KafkaLog4jAppender.java:279)
    at org.apache.kafka.log4jappender.KafkaLog4jAppender.activateOptions(KafkaLog4jAppender.java:273)
    at org.apache.log4j.config.PropertySetter.activate(PropertySetter.java:307)
    at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:172)
    at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:104)
    at org.apache.log4j.PropertyConfigurator.parseAppender(PropertyConfigurator.java:842)
    at org.apache.log4j.PropertyConfigurator.parseCategory(PropertyConfigurator.java:768)
    at org.apache.log4j.PropertyConfigurator.configureRootCategory(PropertyConfigurator.java:648)
    at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:514)
    at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:580)
    at org.apache.log4j.helpers.OptionConverter.selectAndConfigure(OptionConverter.java:526)
    at org.apache.log4j.LogManager.<clinit>(LogManager.java:127)
    at org.slf4j.impl.Log4jLoggerFactory.<init>(Log4jLoggerFactory.java:66)
    at org.slf4j.impl.StaticLoggerBinder.<init>(StaticLoggerBinder.java:72)
    at org.slf4j.impl.StaticLoggerBinder.<clinit>(StaticLoggerBinder.java:45)
    at org.apache.spark.internal.Logging$.org$apache$spark$internal$Logging$$isLog4j12(Logging.scala:217)
    at org.apache.spark.internal.Logging$class.initializeLogging(Logging.scala:122)
    at org.apache.spark.internal.Logging$class.initializeLogIfNecessary(Logging.scala:111)
    at org.apache.spark.deploy.yarn.ApplicationMaster$.initializeLogIfNecessary(ApplicationMaster.scala:771)
    at org.apache.spark.internal.Logging$class.initializeLogIfNecessary(Logging.scala:102)
    at org.apache.spark.deploy.yarn.ApplicationMaster$.initializeLogIfNecessary(ApplicationMaster.scala:771)
    at org.apache.spark.internal.Logging$class.log(Logging.scala:49)
    at org.apache.spark.deploy.yarn.ApplicationMaster$.log(ApplicationMaster.scala:771)
    at org.apache.spark.deploy.yarn.ApplicationMaster$.main(ApplicationMaster.scala:786)
    at org.apache.spark.deploy.yarn.ApplicationMaster.main(ApplicationMaster.scala)
Exception in thread "main" java.lang.ExceptionInInitializerError
    at org.apache.kafka.clients.producer.ProducerConfig.<clinit>(ProducerConfig.java:333)
    at org.apache.kafka.clients.producer.KafkaProducer.<init>(KafkaProducer.java:327)
    at org.apache.kafka.clients.producer.KafkaProducer.<init>(KafkaProducer.java:299)
    at org.apache.kafka.log4jappender.KafkaLog4jAppender.getKafkaProducer(KafkaLog4jAppender.java:279)
    at org.apache.kafka.log4jappender.KafkaLog4jAppender.activateOptions(KafkaLog4jAppender.java:273)
    at org.apache.log4j.config.PropertySetter.activate(PropertySetter.java:307)
    at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:172)
    at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:104)
    at org.apache.log4j.PropertyConfigurator.parseAppender(PropertyConfigurator.java:842)
    at org.apache.log4j.PropertyConfigurator.parseCategory(PropertyConfigurator.java:768)
    at org.apache.log4j.PropertyConfigurator.configureRootCategory(PropertyConfigurator.java:648)
    at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:514)
    at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:580)
    at org.apache.log4j.helpers.OptionConverter.selectAndConfigure(OptionConverter.java:526)
    at org.apache.log4j.LogManager.<clinit>(LogManager.java:127)
    at org.slf4j.impl.Log4jLoggerFactory.<init>(Log4jLoggerFactory.java:66)
    at org.slf4j.impl.StaticLoggerBinder.<init>(StaticLoggerBinder.java:72)
    at org.slf4j.impl.StaticLoggerBinder.<clinit>(StaticLoggerBinder.java:45)
    at org.apache.spark.internal.Logging$.org$apache$spark$internal$Logging$$isLog4j12(Logging.scala:217)
    at org.apache.spark.internal.Logging$class.initializeLogging(Logging.scala:122)
    at org.apache.spark.internal.Logging$class.initializeLogIfNecessary(Logging.scala:111)
    at org.apache.spark.deploy.yarn.ApplicationMaster$.initializeLogIfNecessary(ApplicationMaster.scala:771)
    at org.apache.spark.internal.Logging$class.initializeLogIfNecessary(Logging.scala:102)
    at org.apache.spark.deploy.yarn.ApplicationMaster$.initializeLogIfNecessary(ApplicationMaster.scala:771)
    at org.apache.spark.internal.Logging$class.log(Logging.scala:49)
    at org.apache.spark.deploy.yarn.ApplicationMaster$.log(ApplicationMaster.scala:771)
    at org.apache.spark.deploy.yarn.ApplicationMaster$.main(ApplicationMaster.scala:786)
    at org.apache.spark.deploy.yarn.ApplicationMaster.main(ApplicationMaster.scala)
Caused by: java.lang.NullPointerException
    at org.slf4j.LoggerFactory.getILoggerFactory(LoggerFactory.java:418)
    at org.slf4j.LoggerFactory.getLogger(LoggerFactory.java:357)
    at org.slf4j.LoggerFactory.getLogger(LoggerFactory.java:383)
    at org.apache.kafka.clients.CommonClientConfigs.<clinit>(CommonClientConfigs.java:32)
    ... 28 more

kmynzznz1#

Had the same problem on CDH 6.3.3.
It looks like an incompatibility between the kafka-log4j-appender:2.2.1-cdh6.3.3 and slf4j-api:1.7.25 libraries.
The workaround used was to specify a different version of the slf4j-api and slf4j-log4j12 libraries (it does not work with the 1.7 line; we had to use 1.8).
The steps to solve it are:
First, declare these libraries as dependencies in the pom.xml file...
<properties>
    <slf4j.version>1.8.0-beta4</slf4j.version>
</properties>

<!-- "provided" scope so the copy-dependencies filter below
     (includeScope=provided) picks these dependencies up -->
<dependency>
    <groupId>org.slf4j</groupId>
    <artifactId>slf4j-api</artifactId>
    <version>${slf4j.version}</version>
    <scope>provided</scope>
</dependency>

<dependency>
    <groupId>org.slf4j</groupId>
    <artifactId>slf4j-log4j12</artifactId>
    <version>${slf4j.version}</version>
    <scope>provided</scope>
</dependency>

Use the maven-dependency-plugin to copy these dependencies into the target directory...

<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-dependency-plugin</artifactId>
    <version>3.1.1</version>
    <executions>
        <execution>
            <phase>package</phase>
            <goals>
                <goal>copy-dependencies</goal>
            </goals>
        </execution>
    </executions>
    <configuration>
        <includeScope>provided</includeScope>
        <outputDirectory>target</outputDirectory>
        <includeArtifactIds>slf4j-api,slf4j-log4j12</includeArtifactIds>
        <stripVersion>true</stripVersion>
    </configuration>
</plugin>
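
After building, the two jars should then appear in the target directory with their version suffixes stripped, because of stripVersion=true. A quick check, assuming a standard Maven project layout:

mvn package
ls target/slf4j-api.jar target/slf4j-log4j12.jar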

Include the following Spark properties:

spark.driver.extraClassPath=slf4j-api.jar:slf4j-log4j12.jar
spark.executor.extraClassPath=slf4j-api.jar:slf4j-log4j12.jar

Finally, pass this argument to spark-submit:

--jars slf4j-api.jar,slf4j-log4j12.jar
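
Putting the steps together, the full launch could look like this sketch (the master settings, main class, and application jar name are placeholders, not from the original answer):

spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --conf spark.driver.extraClassPath=slf4j-api.jar:slf4j-log4j12.jar \
  --conf spark.executor.extraClassPath=slf4j-api.jar:slf4j-log4j12.jar \
  --jars slf4j-api.jar,slf4j-log4j12.jar \
  --class com.example.MyYarnJob \
  my-yarn-job.jar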
