Azure Databricks: Could not initialize class org.apache.spark.eventhubs.EventHubsConf

p3rjfoxz posted on 2021-05-27 in Spark

I'm fairly new to Scala, and I'm trying to create a notebook that processes the data written to an Azure Event Hub. Here is my code:

import org.apache.spark.eventhubs._

// Build the connection string for the target Event Hub
val connectionString = ConnectionStringBuilder("MY-CONNECTION-STRING")
  .setEventHubName("EVENT-HUB-NAME")
  .build

// Start reading from the end of the stream (new events only)
val eventHubsConf = EventHubsConf(connectionString)
  .setStartingPosition(EventPosition.fromEndOfStream)

// Create a streaming DataFrame backed by the Event Hub
val eventhubs = spark.readStream
  .format("eventhubs")
  .options(eventHubsConf.toMap)
  .load()
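A minimal sink to consume the stream would then be attached along these lines (a sketch for context only, writing the raw records to the console; the failure described below happens before this point):

// Write the raw Event Hubs records to the console for inspection.
// Assumes `spark` is the SparkSession predefined in Databricks notebooks.
val query = eventhubs.writeStream
  .format("console")
  .start()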

I get the following error:

java.lang.NoClassDefFoundError: Could not initialize class org.apache.spark.eventhubs.EventHubsConf$

Cluster configuration:
Databricks Runtime version: 7.0 (includes Apache Spark 3.0.0, Scala 2.12)
Driver and worker type: Standard_DS3_v2 (14.0 GB memory, 4 cores, 0.75 DBU)
Installed library: com.microsoft.azure:azure-eventhubs-spark_2.11:2.3.17 (cluster library)
The other JAR installed on the cluster is only there to work around a logging issue.
The code crashes as soon as I try to create the EventHubsConf. Full stack trace:

java.lang.NoClassDefFoundError: Could not initialize class org.apache.spark.eventhubs.EventHubsConf$
    at line14a6ae940dd14957b7172a4cf8f6cdd348.$read$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw.<init>(command-2632683088190841:7)
    at line14a6ae940dd14957b7172a4cf8f6cdd348.$read$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw.<init>(command-2632683088190841:70)
    at line14a6ae940dd14957b7172a4cf8f6cdd348.$read$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw.<init>(command-2632683088190841:72)
    at line14a6ae940dd14957b7172a4cf8f6cdd348.$read$$iw$$iw$$iw$$iw$$iw$$iw$$iw.<init>(command-2632683088190841:74)
    at line14a6ae940dd14957b7172a4cf8f6cdd348.$read$$iw$$iw$$iw$$iw$$iw$$iw.<init>(command-2632683088190841:76)
    at line14a6ae940dd14957b7172a4cf8f6cdd348.$read$$iw$$iw$$iw$$iw$$iw.<init>(command-2632683088190841:78)
    at line14a6ae940dd14957b7172a4cf8f6cdd348.$read$$iw$$iw$$iw$$iw.<init>(command-2632683088190841:80)
    at line14a6ae940dd14957b7172a4cf8f6cdd348.$read$$iw$$iw$$iw.<init>(command-2632683088190841:82)
    at line14a6ae940dd14957b7172a4cf8f6cdd348.$read$$iw$$iw.<init>(command-2632683088190841:84)
    at line14a6ae940dd14957b7172a4cf8f6cdd348.$read$$iw.<init>(command-2632683088190841:86)
    at line14a6ae940dd14957b7172a4cf8f6cdd348.$read.<init>(command-2632683088190841:88)
    at line14a6ae940dd14957b7172a4cf8f6cdd348.$read$.<init>(command-2632683088190841:92)
    at line14a6ae940dd14957b7172a4cf8f6cdd348.$read$.<clinit>(command-2632683088190841)
    at line14a6ae940dd14957b7172a4cf8f6cdd348.$eval$.$print$lzycompute(<notebook>:7)
    at line14a6ae940dd14957b7172a4cf8f6cdd348.$eval$.$print(<notebook>:6)
    at line14a6ae940dd14957b7172a4cf8f6cdd348.$eval.$print(<notebook>)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:745)
    at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1021)
    at scala.tools.nsc.interpreter.IMain.$anonfun$interpret$1(IMain.scala:574)
    at scala.reflect.internal.util.ScalaClassLoader.asContext(ScalaClassLoader.scala:41)
    at scala.reflect.internal.util.ScalaClassLoader.asContext$(ScalaClassLoader.scala:37)
    at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:41)
    at scala.tools.nsc.interpreter.IMain.loadAndRunReq$1(IMain.scala:573)
    at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:600)
    at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:570)
    at com.databricks.backend.daemon.driver.DriverILoop.execute(DriverILoop.scala:215)
    at com.databricks.backend.daemon.driver.ScalaDriverLocal.$anonfun$repl$1(ScalaDriverLocal.scala:202)
    at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
    at com.databricks.backend.daemon.driver.DriverLocal$TrapExitInternal$.trapExit(DriverLocal.scala:714)
    at com.databricks.backend.daemon.driver.DriverLocal$TrapExit$.apply(DriverLocal.scala:667)
    at com.databricks.backend.daemon.driver.ScalaDriverLocal.repl(ScalaDriverLocal.scala:202)
    at com.databricks.backend.daemon.driver.DriverLocal.$anonfun$execute$10(DriverLocal.scala:396)
    at com.databricks.logging.UsageLogging.$anonfun$withAttributionContext$1(UsageLogging.scala:238)
    at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62)
    at com.databricks.logging.UsageLogging.withAttributionContext(UsageLogging.scala:233)
    at com.databricks.logging.UsageLogging.withAttributionContext$(UsageLogging.scala:230)
    at com.databricks.backend.daemon.driver.DriverLocal.withAttributionContext(DriverLocal.scala:49)
    at com.databricks.logging.UsageLogging.withAttributionTags(UsageLogging.scala:275)
    at com.databricks.logging.UsageLogging.withAttributionTags$(UsageLogging.scala:268)
    at com.databricks.backend.daemon.driver.DriverLocal.withAttributionTags(DriverLocal.scala:49)
    at com.databricks.backend.daemon.driver.DriverLocal.execute(DriverLocal.scala:373)
    at com.databricks.backend.daemon.driver.DriverWrapper.$anonfun$tryExecutingCommand$1(DriverWrapper.scala:653)
    at scala.util.Try$.apply(Try.scala:213)
    at com.databricks.backend.daemon.driver.DriverWrapper.tryExecutingCommand(DriverWrapper.scala:645)
    at com.databricks.backend.daemon.driver.DriverWrapper.getCommandOutputAndError(DriverWrapper.scala:486)
    at com.databricks.backend.daemon.driver.DriverWrapper.executeCommand(DriverWrapper.scala:598)
    at com.databricks.backend.daemon.driver.DriverWrapper.runInnerLoop(DriverWrapper.scala:391)
    at com.databricks.backend.daemon.driver.DriverWrapper.runInner(DriverWrapper.scala:337)
    at com.databricks.backend.daemon.driver.DriverWrapper.run(DriverWrapper.scala:219)
    at java.lang.Thread.run(Thread.java:748)

rbl8hiat #1

It looks like your runtime includes Scala 2.12, but your package was built for Scala 2.11.
Try installing this one instead:

com.microsoft.azure:azure-eventhubs-spark_2.12:2.3.17
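The _2.11 / _2.12 suffix in the Maven coordinate is the Scala binary version the artifact was compiled against, and it must match the Scala version of the Databricks runtime; a mismatch surfaces as linkage errors like this NoClassDefFoundError. If in doubt, a quick sanity check (a minimal sketch, runnable in any Scala notebook cell) is to print the Scala version the cluster actually runs:

// Prints the Scala library version of the running cluster,
// e.g. "version 2.12.10" on Databricks Runtime 7.0
println(scala.util.Properties.versionString)

Then pick the artifact whose _2.xx suffix matches the printed version.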
