object FlinkKafkaConsumer010 is not a member of package org.apache.flink.streaming.connectors.kafka

blpfk2vs · asked 2021-06-05 · Kafka

I am trying to put together a small Apache Flink program that connects to a Kafka topic. I need to use FlinkKafkaConsumer010.

package uimp

import org.apache.flink.streaming.api.scala._
import org.apache.flink.streaming.api.TimeCharacteristic
import org.apache.flink.streaming.util.serialization.SimpleStringSchema
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer010
import java.util.Properties

object Silocompro {
  def main(args: Array[String]): Unit = {
    // set up the execution environment
    val env = StreamExecutionEnvironment.getExecutionEnvironment
    env.setStreamTimeCharacteristic(TimeCharacteristic.ProcessingTime)

    val propertiesTopicDemographic = new Properties()
    propertiesTopicDemographic.setProperty("bootstrap.servers", "bigdata.dataspartan.com:19093")
    propertiesTopicDemographic.setProperty("group.id", "demographic")

    val myConsumerDemographic = new FlinkKafkaConsumer010[String](
      "topic_demographic", new SimpleStringSchema(), propertiesTopicDemographic)

    val messageStreamDemographic = env
      .addSource(myConsumerDemographic)
      .print()

    env.execute("Flink Scala API Skeleton")
  }
}

My problem is that when I try to assemble the program with the build.sbt below, the compiler returns the error "object FlinkKafkaConsumer010 is not a member of package org.apache.flink.streaming.connectors.kafka":

ThisBuild / resolvers ++= Seq(
  "Apache Development Snapshot Repository" at
    "https://repository.apache.org/content/repositories/snapshots/",
  Resolver.mavenLocal)

name := "silocompro"

version := "1.0"

organization := "uimp"

ThisBuild / scalaVersion := "2.12.11"

val flinkVersion = "1.9.0"

val flinkDependencies = Seq(
  "org.apache.flink" %% "flink-scala" % flinkVersion % "provided",
  "org.apache.flink" %% "flink-streaming-scala" % flinkVersion % "provided",
  "org.apache.flink" %% "flink-core" % flinkVersion % "provided",
  "org.apache.flink" %% "flink-connector-kafka-base" % flinkVersion % "provided",
  "org.apache.flink" %% "flink-clients" % flinkVersion % "provided",
  "org.apache.flink" %% "flink-connector-kafka" % flinkVersion % "provided")

lazy val root = (project in file("."))
  .settings(libraryDependencies ++= flinkDependencies)

assembly / mainClass := Some("uimp.Silocompro")

Compile / run := Defaults.runTask(Compile / fullClasspath,
                                  Compile / run / mainClass,
                                  Compile / run / runner).evaluated

Compile / run / fork := true
Global / cancelable := true

assembly / assemblyOption := (assembly / assemblyOption).value.copy(includeScala = false)

What is the cause of this dependency error?

py49o6xq 1#

In the end I got to the bottom of the dependency problem. I took the following steps (the resulting dependency list is sketched after this answer):

- I added a new resolver, https://oss.sonatype.org/content/repositories
- I uninstalled the Metals (Scala) plugin from VS Code
- I added "org.apache.flink" %% "flink-connector-kafka-0.10" % flinkVersion to my flinkDependencies

After that, my library dependency problem was solved. Thanks.
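For reference, a minimal sketch of what the amended flinkDependencies might look like, assuming Flink 1.9.0 as in the question. The version-specific flink-connector-kafka-0.10 artifact is the one that contains FlinkKafkaConsumer010; the generic flink-connector-kafka artifact only ships the universal FlinkKafkaConsumer class, which is why the original build could not resolve the 0.10 consumer:

// Sketch of the amended dependency list, assuming Flink 1.9.0 as above
val flinkVersion = "1.9.0"

val flinkDependencies = Seq(
  "org.apache.flink" %% "flink-scala" % flinkVersion % "provided",
  "org.apache.flink" %% "flink-streaming-scala" % flinkVersion % "provided",
  "org.apache.flink" %% "flink-clients" % flinkVersion % "provided",
  // the connector artifact that actually defines FlinkKafkaConsumer010
  "org.apache.flink" %% "flink-connector-kafka-0.10" % flinkVersion)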

nuypyhwy 2#

Connectors are not part of the Flink binary distribution, which means you need them in compile scope; in practice that means removing "provided" from those connector dependencies. With that setup the application will work on a cluster.
However, if you want to run it locally without starting a cluster, you should have all of the Flink dependencies in compile scope, i.e. remove all of the "provided" scope declarations. A sketch of both setups follows.
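A minimal sketch of the two setups described above, using the same artifacts as the question's build.sbt; the names clusterDeps and localDeps are illustrative, not part of the original build:

// Cluster deployment: Flink core/runtime stays "provided" (the cluster's
// classpath supplies it), while connectors ship inside the fat jar.
val clusterDeps = Seq(
  "org.apache.flink" %% "flink-scala" % flinkVersion % "provided",
  "org.apache.flink" %% "flink-streaming-scala" % flinkVersion % "provided",
  "org.apache.flink" %% "flink-clients" % flinkVersion % "provided",
  "org.apache.flink" %% "flink-connector-kafka-0.10" % flinkVersion) // compile scope

// Local runs from sbt without a cluster: drop "provided" everywhere so the
// Flink runtime itself is on the application classpath.
val localDeps = Seq(
  "org.apache.flink" %% "flink-scala" % flinkVersion,
  "org.apache.flink" %% "flink-streaming-scala" % flinkVersion,
  "org.apache.flink" %% "flink-clients" % flinkVersion,
  "org.apache.flink" %% "flink-connector-kafka-0.10" % flinkVersion)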
