Flink generic Avro deserializer: overriding getProducedType

yxyvkwin · published 2021-06-24 in Flink

I want to create a generic Avro deserializer and use it with Kafka in Flink.
To do this, I extend `DeserializationSchema` from the Flink API:

import java.io.ByteArrayInputStream

import com.sksamuel.avro4s.{AvroInputStream, FromRecord, SchemaFor, ToRecord}
import org.apache.flink.api.common.serialization.DeserializationSchema
import org.apache.flink.api.common.typeinfo.TypeInformation
import org.apache.flink.api.java.typeutils.TypeExtractor

class FidesGenericAvroDeserializer[T](implicit schema: SchemaFor[T], toRecord: ToRecord[T], fromRecord: FromRecord[T])
  extends DeserializationSchema[T] {

  override def isEndOfStream(nextElement: T): Boolean = false

  override def deserialize(message: Array[Byte]): T = {
    AvroInputStream.binary[T](new ByteArrayInputStream(message)).iterator.toSeq.head
  }

  override def getProducedType: TypeInformation[T] = TypeExtractor.getForClass(classOf[T])
}

This fails to compile, because `T` is not a concrete class:

class type required but T found
override def getProducedType: TypeInformation[T] = TypeExtractor.getForClass(classOf[T])

fnx2tebb1#

Answering my own question: I had to use a `ClassTag` and force the type with `asInstanceOf`, and it works now:

import java.io.ByteArrayInputStream

import com.sksamuel.avro4s.{AvroInputStream, FromRecord, SchemaFor, ToRecord}
import org.apache.flink.api.common.serialization.DeserializationSchema
import org.apache.flink.api.common.typeinfo.TypeInformation
import org.apache.flink.api.java.typeutils.TypeExtractor

import scala.reflect.{ClassTag, classTag}

class FidesGenericAvroDeserializer[T: ClassTag](implicit schema: SchemaFor[T], toRecord: ToRecord[T], fromRecord: FromRecord[T])
  extends DeserializationSchema[T] {

  override def isEndOfStream(nextElement: T): Boolean = false

  override def deserialize(message: Array[Byte]): T = {
    AvroInputStream.binary[T](new ByteArrayInputStream(message)).iterator.toSeq.head
  }

  // The ClassTag carries T's runtime class through erasure; getForClass on a
  // Class[_] yields a TypeInformation[_], so the cast back to T is required.
  override def getProducedType: TypeInformation[T] =
    TypeExtractor.getForClass(classTag[T].runtimeClass).asInstanceOf[TypeInformation[T]]

}
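The core of the fix is independent of Flink: `classOf[T]` requires a concrete class at compile time, while a `ClassTag` captured at the call site survives erasure. A minimal self-contained sketch of that technique (the `runtimeClassOf` helper is hypothetical, purely for illustration):

```scala
import scala.reflect.{ClassTag, classTag}

object ClassTagDemo {
  // classOf[T] does not compile for an abstract type parameter T: the compiler
  // needs a concrete class literal. A ClassTag implicit, captured where T is
  // known concretely, carries the runtime class instead.
  def runtimeClassOf[T: ClassTag]: Class[_] = classTag[T].runtimeClass

  def main(args: Array[String]): Unit = {
    // The runtime class recovered via ClassTag matches the class literal.
    println(runtimeClassOf[String] == classOf[String])
  }
}
```

This is why the deserializer above takes `T: ClassTag` as a context bound: at the point where `FidesGenericAvroDeserializer[MyRecord]` is instantiated, `MyRecord` is concrete, so the compiler can materialize the `ClassTag` that `getProducedType` later uses.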
