Unable to create a local Spark session during Scala tests: log4j:ERROR Could not create an Appender

Asked by xfb7svmp on 2022-11-06 in Scala

When I try to create a local Spark session in a Scala test using FunSuite, I get the following error:

log4j:ERROR Could not create an Appender. Reported error follows.
java.lang.ClassNotFoundException: com.microsoft.applicationinsights.log4j.v1_2.ApplicationInsightsAppender
    at java.net.URLClassLoader.findClass(URLClassLoader.java:387)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:355)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
    at java.lang.Class.forName0(Native Method)
    at java.lang.Class.forName(Class.java:264)
    at org.apache.log4j.helpers.Loader.loadClass(Loader.java:198)
    at org.apache.log4j.xml.DOMConfigurator.parseAppender(DOMConfigurator.java:247)
    at org.apache.log4j.xml.DOMConfigurator.findAppenderByName(DOMConfigurator.java:176)
    at org.apache.log4j.xml.DOMConfigurator.findAppenderByReference(DOMConfigurator.java:191)
    at org.apache.log4j.xml.DOMConfigurator.parseChildrenOfLoggerElement(DOMConfigurator.java:523)
    at org.apache.log4j.xml.DOMConfigurator.parseRoot(DOMConfigurator.java:492)
    at org.apache.log4j.xml.DOMConfigurator.parse(DOMConfigurator.java:1006)
    at org.apache.log4j.xml.DOMConfigurator.doConfigure(DOMConfigurator.java:872)
    at org.apache.log4j.xml.DOMConfigurator.doConfigure(DOMConfigurator.java:778)
    at org.apache.log4j.helpers.OptionConverter.selectAndConfigure(OptionConverter.java:526)
    at org.apache.log4j.LogManager.(LogManager.java:127)
    at org.slf4j.impl.Log4jLoggerFactory.(Log4jLoggerFactory.java:66)
    at org.slf4j.impl.StaticLoggerBinder.(StaticLoggerBinder.java:72)
    at org.slf4j.impl.StaticLoggerBinder.(StaticLoggerBinder.java:45)
    at org.slf4j.LoggerFactory.bind(LoggerFactory.java:150)
    at org.slf4j.LoggerFactory.performInitialization(LoggerFactory.java:124)
    at org.slf4j.LoggerFactory.getILoggerFactory(LoggerFactory.java:417)
    at org.slf4j.LoggerFactory.getLogger(LoggerFactory.java:362)
    at org.slf4j.LoggerFactory.getLogger(LoggerFactory.java:388)
    at org.apache.spark.network.util.JavaUtils.(JavaUtils.java:41)
    at org.apache.spark.internal.config.ConfigHelpers$.byteFromString(ConfigBuilder.scala:67)
    at org.apache.spark.internal.config.ConfigBuilder.$anonfun$bytesConf$1(ConfigBuilder.scala:259)
    at org.apache.spark.internal.config.ConfigBuilder.$anonfun$bytesConf$1$adapted(ConfigBuilder.scala:259)
    at org.apache.spark.internal.config.TypedConfigBuilder.$anonfun$transform$1(ConfigBuilder.scala:101)
    at org.apache.spark.internal.config.TypedConfigBuilder.createWithDefault(ConfigBuilder.scala:144)
    at org.apache.spark.internal.config.package$.(package.scala:345)
    at org.apache.spark.internal.config.package$.(package.scala)
    at org.apache.spark.SparkConf$.(SparkConf.scala:654)
    at org.apache.spark.SparkConf$.(SparkConf.scala)
    at org.apache.spark.SparkConf.set(SparkConf.scala:94)
    at org.apache.spark.SparkConf.set(SparkConf.scala:83)
    at org.apache.spark.sql.SparkSession$Builder.$anonfun$getOrCreate$1(SparkSession.scala:916)
    at scala.collection.mutable.HashMap.$anonfun$foreach$1(HashMap.scala:149)
    at scala.collection.mutable.HashTable.foreachEntry(HashTable.scala:237)
    at scala.collection.mutable.HashTable.foreachEntry$(HashTable.scala:230)
    at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:44)
    at scala.collection.mutable.HashMap.foreach(HashMap.scala:149)
    at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:916)
    at com.random.SampleSpec.(SampleSpec.scala:9)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at java.lang.Class.newInstance(Class.java:442)
    at org.scalatest.tools.DiscoverySuite$.getSuiteInstance(DiscoverySuite.scala:66)
    at org.scalatest.tools.DiscoverySuite.$anonfun$nestedSuites$1(DiscoverySuite.scala:38)
    at scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:286)
    at scala.collection.Iterator.foreach(Iterator.scala:943)
    at scala.collection.Iterator.foreach$(Iterator.scala:943)
    at scala.collection.AbstractIterator.foreach(Iterator.scala:1431)
    at scala.collection.IterableLike.foreach(IterableLike.scala:74)
    at scala.collection.IterableLike.foreach$(IterableLike.scala:73)
    at scala.collection.AbstractIterable.foreach(Iterable.scala:56)
    at scala.collection.TraversableLike.map(TraversableLike.scala:286)
    at scala.collection.TraversableLike.map$(TraversableLike.scala:279)
    at scala.collection.AbstractTraversable.map(Traversable.scala:108)
    at org.scalatest.tools.DiscoverySuite.(DiscoverySuite.scala:37)
    at org.scalatest.tools.Runner$.genDiscoSuites$1(Runner.scala:1128)
    at org.scalatest.tools.Runner$.doRunRunRunDaDoRunRun(Runner.scala:1224)
    at org.scalatest.tools.Runner$.$anonfun$runOptionallyWithPassFailReporter$24(Runner.scala:993)
    at org.scalatest.tools.Runner$.$anonfun$runOptionallyWithPassFailReporter$24$adapted(Runner.scala:971)
    at org.scalatest.tools.Runner$.withClassLoaderAndDispatchReporter(Runner.scala:1480)
    at org.scalatest.tools.Runner$.runOptionallyWithPassFailReporter(Runner.scala:971)
    at org.scalatest.tools.Runner$.main(Runner.scala:775)
    at org.scalatest.tools.Runner.main(Runner.scala)

Here are the code snippet and the test configuration used in the pom.
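The snippet itself is not shown, but a minimal sketch of such a test would look roughly like the following (only the fully qualified class name com.random.SampleSpec, and the fact that the session is built in the class body, come from the stack trace above; the builder settings and test body are assumptions):

package com.random

import org.apache.spark.sql.SparkSession
import org.scalatest.funsuite.AnyFunSuite

// Illustrative sketch only; not the original test code from the question.
class SampleSpec extends AnyFunSuite {

  // Building the session in the class body matches the stack trace, which
  // fails inside SparkSession.Builder.getOrCreate at SampleSpec.scala:9.
  private val spark: SparkSession = SparkSession
    .builder()
    .master("local[*]")
    .appName("sample-test")
    .getOrCreate()

  test("a trivial DataFrame can be created") {
    import spark.implicits._
    val df = Seq(1, 2, 3).toDF("value")
    assert(df.count() == 3)
  }
}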
I am running this on IntelliJ version 2021.2.3.
The same code works in another, older module. I tried copying almost everything over from the old code, but the problem persists. I also tried invalidating caches and restarting, building and rebuilding the module, and adding framework support for Scala, but nothing seems to help.
The local Spark installation and the Spark version in the pom match (3.1.3), and the Scala version is 2.12.
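For reference, pom entries consistent with those versions would look roughly like this (only Spark 3.1.3 and Scala 2.12 are stated above; the ScalaTest artifact and its version are assumptions):

<properties>
    <!-- Versions mentioned in the question -->
    <scala.binary.version>2.12</scala.binary.version>
    <spark.version>3.1.3</spark.version>
</properties>

<dependencies>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_${scala.binary.version}</artifactId>
        <version>${spark.version}</version>
    </dependency>
    <!-- Test framework used for the FunSuite test; version is an assumption -->
    <dependency>
        <groupId>org.scalatest</groupId>
        <artifactId>scalatest_${scala.binary.version}</artifactId>
        <version>3.2.14</version>
        <scope>test</scope>
    </dependency>
</dependencies>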


5jvtdoz2 (Answer 1)

Apparently your Log4j configuration file (log4j.xml) references com.microsoft.applicationinsights.log4j.v1_2.ApplicationInsightsAppender as an appender, but you do not have the following dependency:

Artifact: applicationinsights-logging-log4j1_2
Group: com.microsoft.azure
Version: 0.9.0

See: https://jar-download.com/maven-repository-class-search.php?search_box=com.microsoft.applicationinsights.log4j.v1_2.ApplicationInsightsAppender
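If you do want to keep that appender, adding the missing dependency to the pom would look like this (coordinates exactly as listed above):

<!-- Provides com.microsoft.applicationinsights.log4j.v1_2.ApplicationInsightsAppender -->
<dependency>
    <groupId>com.microsoft.azure</groupId>
    <artifactId>applicationinsights-logging-log4j1_2</artifactId>
    <version>0.9.0</version>
</dependency>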
