I am trying to create a metrics source in a Spark application written in Scala so I can export data to another system, preferably Prometheus. According to this site from Databricks, I need to create a source that extends the `Source` trait. However, the trait is declared as `private[spark] trait Source`, so my source is not visible from my code. When I create the class below I get the error `Symbol Source is inaccessible from this place`.
package org.sense.spark.util

import org.apache.spark.metrics.source.Source
import com.codahale.metrics.{Counter, Histogram, MetricRegistry}

class MetricSource extends Source {
  override val sourceName: String = "MySource"
  override val metricRegistry: MetricRegistry = new MetricRegistry

  val FOO: Histogram = metricRegistry.histogram(MetricRegistry.name("fooHistory"))
  val FOO_COUNTER: Counter = metricRegistry.counter(MetricRegistry.name("fooCounter"))
}
How can I create my source so it exports data to Prometheus? I want to export values from a combineByKey transformation: the latency of the aggregation and the throughput of the transformation. Here is my build.sbt file, in case it is necessary to check the libraries I am using.
name := "explore-spark"
version := "0.2"
scalaVersion := "2.12.3"
val sparkVersion = "3.0.0"
libraryDependencies ++= Seq(
"org.apache.spark" %% "spark-core" % sparkVersion,
"org.apache.spark" %% "spark-streaming" % sparkVersion % "provided",
"org.apache.spark" %% "spark-sql" % sparkVersion % "provided",
"com.twitter" %% "algebird-core" % "0.13.7",
"joda-time" % "joda-time" % "2.5",
"org.fusesource.mqtt-client" % "mqtt-client" % "1.16"
)
mainClass in (Compile, packageBin) := Some("org.sense.spark.app.App")
mainClass in assembly := Some("org.sense.spark.app.App")
assemblyOption in assembly := (assemblyOption in assembly).value.copy(includeScala = false)
assemblyJarName in assembly := s"${name.value}_${scalaBinaryVersion.value}-fat_${version.value}.jar"
2 Answers

Answer 1 (v7pvogib1):
If you need a Prometheus sink instead of the console sink, you can use a third-party library that implements a Prometheus sink for Spark. It works via the Pushgateway: https://github.com/banzaicloud/spark-metrics/blob/master/prometheussink.md
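As a sketch of how that library is wired up (the exact artifact version and the Pushgateway address below are placeholders you should adapt to your environment), the sink is added as a dependency and enabled through Spark's metrics.properties:

```properties
# conf/metrics.properties — enable the banzaicloud Prometheus sink for all instances.
# The sink class ships inside the org.apache.spark namespace for the same
# visibility reason discussed in this question.
*.sink.prometheus.class=org.apache.spark.banzaicloud.metrics.sink.PrometheusSink
# Address of your Prometheus Pushgateway (host:port is an example value)
*.sink.prometheus.pushgateway-address=127.0.0.1:9091
```

In build.sbt you would add the matching dependency, e.g. `"com.banzaicloud" %% "spark-metrics" % "<version>"` (check the project's README for the version that matches Spark 3.0.0), and point Spark at the file with `--conf spark.metrics.conf=metrics.properties`.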
Answer 2 (t3irkdon2):
You need to place the class that extends Source in the same package as Source.
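Concretely, that means declaring your class under Spark's own package namespace so the `private[spark]` trait becomes visible to it. This is a sketch of the workaround, not an officially supported API: it relies on Spark internals and may break between Spark versions.

```scala
// Declaring the class under org.apache.spark.metrics.source satisfies the
// private[spark] access modifier on the Source trait. Anything under the
// org.apache.spark package root counts as "inside spark" to the compiler.
package org.apache.spark.metrics.source

import com.codahale.metrics.{Counter, Histogram, MetricRegistry}

class MetricSource extends Source {
  override val sourceName: String = "MySource"
  override val metricRegistry: MetricRegistry = new MetricRegistry

  val FOO: Histogram = metricRegistry.histogram(MetricRegistry.name("fooHistory"))
  val FOO_COUNTER: Counter = metricRegistry.counter(MetricRegistry.name("fooCounter"))
}
```

Registering the source at runtime (e.g. via `SparkEnv.get.metricsSystem.registerSource(...)`) hits the same problem, because `metricsSystem` is also `private[spark]`, so the registration code likewise needs to live under an `org.apache.spark` sub-package.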