Spring Cloud Stream Kafka: KTable as input not working

w1jd8yoj asked on 2021-06-04 in Kafka

I am using Spring Cloud Stream with Kafka, and using a KTable as the input binding is not working.
EventSink.java

import org.apache.kafka.streams.kstream.KTable;
import org.springframework.cloud.stream.annotation.Input;

public interface EventSink {

    @Input("inputTable")
    KTable<?, ?> inputTable();
}

MessageReceiver.java

import org.apache.kafka.streams.kstream.KTable;
import org.springframework.cloud.stream.annotation.EnableBinding;
import org.springframework.cloud.stream.annotation.Input;
import org.springframework.cloud.stream.annotation.StreamListener;

@EnableBinding(EventSink.class)
public class MessageReceiver {

    @StreamListener
    public void process(@Input("inputTable") KTable<String, Event> eventTable) {

        // The code below is just for illustration; I need to do a lot of things after getting this KTable
        eventTable.toStream()
                .foreach((key, value) -> System.out.println(value));
    }
}

application.yml

server:
  port: 8083

spring:
  cloud:
    stream:
      kafka:
        streams:
          binder:
            application-id: kafka-stream-demo
            configuration:
              default:
                key:
                  serde: org.apache.kafka.common.serialization.Serdes$StringSerde
                value:
                  serde: org.springframework.kafka.support.serializer.JsonSerde
          bindings:
            inputTable:
              materialized-as: event_store
        binder:
          brokers: localhost:9092
      bindings:
        inputTable:
          destination: nscevent
          group: nsceventGroup

I am getting the error below:

Exception in thread "kafka-stream-demo-1e64cf93-de19-4185-bee4-8fc882275010-StreamThread-1" org.apache.kafka.streams.errors.StreamsException: Deserialization exception handler is set to fail upon a deserialization error. If you would rather have the streaming pipeline continue after a deserialization error, please set the default.deserialization.exception.handler appropriately.
    at org.apache.kafka.streams.processor.internals.RecordDeserializer.deserialize(RecordDeserializer.java:80)
    at org.apache.kafka.streams.processor.internals.RecordQueue.addRawRecords(RecordQueue.java:97)
    at org.apache.kafka.streams.processor.internals.PartitionGroup.addRawRecords(PartitionGroup.java:117)
    at org.apache.kafka.streams.processor.internals.StreamTask.addRecords(StreamTask.java:677)
    at org.apache.kafka.streams.processor.internals.StreamThread.addRecordsToTasks(StreamThread.java:943)
    at org.apache.kafka.streams.processor.internals.StreamThread.runOnce(StreamThread.java:831)
    at org.apache.kafka.streams.processor.internals.StreamThread.runLoop(StreamThread.java:767)
    at org.apache.kafka.streams.processor.internals.StreamThread.run(StreamThread.java:736)
Caused by: java.lang.IllegalStateException: No type information in headers and no default type provided
    at org.springframework.util.Assert.state(Assert.java:73)
    at org.springframework.kafka.support.serializer.JsonDeserializer.deserialize(JsonDeserializer.java:370)
    at org.apache.kafka.streams.processor.internals.SourceNode.deserializeValue(SourceNode.java:63)
    at org.apache.kafka.streams.processor.internals.RecordDeserializer.deserialize(RecordDeserializer.java:66)
    ... 7 more

Can someone tell me what the problem is? With a KStream as input it works, but not with a KTable. Thanks in advance.

jyztefdp #1

A KTable is always converted using Kafka Streams' native Serde feature; framework-level message conversion is not performed on a KTable (although there is an open issue to add it). Since you are using a custom type for the value, you need to specify the proper Serde instead of relying on the default String serde. You can add these to the configuration:

spring.cloud.stream.kafka.streams.binder.configuration:
  default.value.serde: org.springframework.kafka.support.serializer.JsonSerde
  spring.json.value.default.type: RawAccounting
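
Applied to the application.yml from the question, a minimal sketch of the corrected binder section might look like the following. Note that RawAccounting above is just a placeholder type; com.example.Event below is an assumed fully-qualified name for the question's Event class, so substitute your own:

spring:
  cloud:
    stream:
      kafka:
        streams:
          binder:
            application-id: kafka-stream-demo
            configuration:
              default.key.serde: org.apache.kafka.common.serialization.Serdes$StringSerde
              default.value.serde: org.springframework.kafka.support.serializer.JsonSerde
              # Assumed package; use the actual fully-qualified name of your Event class
              spring.json.value.default.type: com.example.Event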

KTable is not automatically converted on the input channel
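
As an illustration of why the default type matters (this sketch is not part of the original answer): a JsonSerde constructed with an explicit target type can deserialize records that carry no type-information headers, which is exactly what spring.json.value.default.type configures for the serde instance Kafka Streams creates from default.value.serde. A minimal standalone sketch, using a hypothetical stand-in for the question's Event class:

import org.springframework.kafka.support.serializer.JsonSerde;

public class JsonSerdeDemo {

    // Hypothetical stand-in for the question's Event class.
    public static class Event {
        public String name;
    }

    public static void main(String[] args) {
        // Built with an explicit target type, the deserializer does not need
        // type headers on the record, so "No type information in headers and
        // no default type provided" cannot occur.
        try (JsonSerde<Event> serde = new JsonSerde<>(Event.class)) {
            Event in = new Event();
            in.name = "sample";
            byte[] bytes = serde.serializer().serialize("nscevent", in);
            Event out = serde.deserializer().deserialize("nscevent", bytes);
            System.out.println(out.name); // prints "sample"
        }
    }
}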
