How to use Avro serialization with Spring-Kafka in Java

egdjgwm8 asked on 2023-03-28 in Java

I am trying to learn Kafka and now Avro. To keep consistency between the sender object and the receiver object, we keep a JSON schema (.avsc). But I am unable to find any simple example of using it. Some examples use Confluent (is Confluent a must for Avro?), some generate the object class via Avro tools. So far I have a working Kafka setup.
Object class

package com.example.kafka;
public class Hello {
String name;
String age;
public Hello(String name, String age) {
    this.name = name;
    this.age = age;
}
@Override
public String toString() {
    return "Hello{" +
            "name='" + name + '\'' +
            ", age='" + age + '\'' +
            '}';
}
public String getName() {
    return name;
}
public void setName(String name) {
    this.name = name;
}
public String getAge() {
    return age;
}
public void setAge(String age) {
    this.age = age;
}

}
Controller class

package com.example.kafka;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;
@RestController
@RequestMapping("/")
class KafkaController {
    @Autowired
    KafkaService kafkaService;
    @GetMapping("test")
    public Hello hello() {
        Hello hello = new Hello("shrikant", "25");
        kafkaService.hello(hello);
        return hello;
    }
}

Main application

package com.example.kafka;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.EnableAutoConfiguration;
import org.springframework.boot.autoconfigure.SpringBootApplication;
@SpringBootApplication
@EnableAutoConfiguration
public class KafkaDemoApplication {
    public static void main(String[] args) {
        SpringApplication.run(KafkaDemoApplication.class, args);
    }
}

KafkaProducerConfig

package com.example.kafka;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.common.serialization.StringSerializer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.core.DefaultKafkaProducerFactory;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.kafka.core.ProducerFactory;
import java.util.HashMap;
import java.util.Map;
@Configuration
public class KafkaProducerConfig {
    @Bean
    public Map<String, Object> producerConfigs() {
        Map<String, Object> props = new HashMap<>();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, KafkaSerializer.class);
        return props;
    }
    @Bean
    public ProducerFactory<String, Hello> producerFactory() {
        return new DefaultKafkaProducerFactory<>(producerConfigs());
    }
    @Bean
    public KafkaTemplate<String, Hello> kafkaTemplate() {
        return new KafkaTemplate<>(producerFactory());
    }
}

KafkaSerializer

package com.example.kafka;
import com.fasterxml.jackson.databind.ObjectMapper;
import org.apache.kafka.common.serialization.Serializer;
import java.util.Map;
public class KafkaSerializer implements Serializer<Hello> {
    @Override
    public byte[] serialize(String arg0, Hello developer) {
        byte[] serializedBytes = null;
        ObjectMapper objectMapper = new ObjectMapper();
        try {
            // writeValueAsBytes avoids the platform-default charset of String.getBytes()
            serializedBytes = objectMapper.writeValueAsBytes(developer);
        } catch (Exception e) {
            e.printStackTrace();
        }
        return serializedBytes;
    }
    @Override
    public void close() {
        // TODO Auto-generated method stub
    }
    @Override
    public void configure(Map<String, ?> arg0, boolean arg1) {
        // TODO Auto-generated method stub
    }
}

KafkaService

package com.example.kafka;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.stereotype.Service;
@Service
public class KafkaService {
    @Autowired
    private KafkaTemplate<String, Hello> kafkaTemplate;
    public void hello(Hello hello) {
        kafkaTemplate.send("test", hello);
    }
}

Hello.avsc

{"namespace": "com.example.kafka",
  "type": "record",
  "name": "Hello",
  "fields": [
    {"name": "name", "type": "string"},
    {"name": "age",  "type": "string"}
  ]
}

build.gradle

buildscript {
    repositories {
        mavenCentral()
    }
    dependencies {
        classpath 'org.springframework.boot:spring-boot-gradle-plugin:1.5.21.RELEASE'
    }
}
plugins {
    id 'java'
}
apply plugin: 'org.springframework.boot'
group = 'com.example.kafka'
version = '0.0.1-SNAPSHOT'
sourceCompatibility = '1.8'
repositories {
    mavenCentral()
}
ext {
    set('spring-kafka.version', "2.2.8.RELEASE")
}
dependencies {
    implementation 'org.springframework.boot:spring-boot-starter'
    implementation 'org.springframework.kafka:spring-kafka'
    implementation 'org.springframework.boot:spring-boot-starter-web'
}

It is a working setup and I am able to send and receive data. What changes do I need to make to get Avro working?

dfty9e19 #1

Confluent maintains a tutorial for exactly this use case: https://docs.confluent.io/platform/current/tutorials/examples/clients/docs/java-springboot.html
You are currently using plain JSON, not "JSON schema". To use Avro in your current setup with minimal changes, you would import the Jackson Avro dataformat ObjectMapper:
https://github.com/FasterXML/jackson-dataformats-binary/blob/master/avro/README.md
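For example, the hand-written KafkaSerializer above could be swapped for a Jackson-based Avro serializer. This is a sketch assuming the `com.fasterxml.jackson.dataformat:jackson-dataformat-avro` dependency (which pulls in `org.apache.avro`) is on the classpath:

```java
import com.fasterxml.jackson.dataformat.avro.AvroMapper;
import com.fasterxml.jackson.dataformat.avro.AvroSchema;
import org.apache.kafka.common.serialization.Serializer;

public class JacksonAvroSerializer implements Serializer<Hello> {

    // Same schema as Hello.avsc, inlined here for brevity;
    // you could also read the .avsc file from the classpath instead
    private static final String SCHEMA_JSON =
            "{\"namespace\": \"com.example.kafka\", \"type\": \"record\", \"name\": \"Hello\","
          + " \"fields\": [{\"name\": \"name\", \"type\": \"string\"},"
          + "              {\"name\": \"age\",  \"type\": \"string\"}]}";

    private final AvroMapper mapper = new AvroMapper();
    private final AvroSchema schema =
            new AvroSchema(new org.apache.avro.Schema.Parser().parse(SCHEMA_JSON));

    @Override
    public byte[] serialize(String topic, Hello hello) {
        try {
            // Writes Avro binary bytes instead of JSON text
            return mapper.writer(schema).writeValueAsBytes(hello);
        } catch (Exception e) {
            throw new RuntimeException("Avro serialization failed", e);
        }
    }
    // configure() and close() have default no-op implementations in kafka-clients 2.x
}
```

You would then point `VALUE_SERIALIZER_CLASS_CONFIG` at this class instead of `KafkaSerializer`; the consumer side needs a matching Jackson Avro deserializer.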

Alternatively (recommended), you can install/run the Confluent Schema Registry and use their serializers, without needing to write your own for every object class you want. Confluent provides the KafkaAvroSerializer class for this.
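With the registry approach the only code change is configuration. A sketch of the producer properties (plain string keys and class names, so it compiles without the Confluent jars; the registry URL is an assumption, 8081 being the Confluent default port):

```java
import java.util.HashMap;
import java.util.Map;

public class ConfluentProducerProps {

    public static Map<String, Object> producerConfigs() {
        Map<String, Object> props = new HashMap<>();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("key.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");
        // Confluent's serializer replaces the hand-written KafkaSerializer;
        // it registers/looks up the schema in the registry and prepends its ID
        props.put("value.serializer",
                "io.confluent.kafka.serializers.KafkaAvroSerializer");
        // Schema Registry endpoint -- an assumed local instance
        props.put("schema.registry.url", "http://localhost:8081");
        return props;
    }
}
```

This also requires the `io.confluent:kafka-avro-serializer` dependency (from Confluent's Maven repository) at runtime.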

To actually use the AVSC, you need to read the file from the filesystem to create a Schema object, or you can use the Avro Gradle plugin to generate the object class for you instead of writing it by hand, which embeds the schema as a variable. https://github.com/commercehub-oss/gradle-avro-plugin
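The Gradle wiring for that plugin might look roughly like this (plugin id from the linked repository; the versions are assumptions):

```groovy
plugins {
    id 'java'
    // Generates com.example.kafka.Hello from src/main/avro/Hello.avsc at compile time
    id 'com.commercehub.gradle.plugin.avro' version '0.9.1'  // version is an assumption
}

dependencies {
    // Runtime classes (Schema, SpecificRecord, ...) needed by the generated code
    implementation 'org.apache.avro:avro:1.8.2'  // version is an assumption
}
```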
Confluent's examples use the Avro Maven plugin, but the idea is similar.
https://docs.confluent.io/current/schema-registry/schema_registry_tutorial.html#example-producer-code
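The other route mentioned above, parsing the schema at runtime and writing a GenericRecord, could be sketched like this (assuming only the `org.apache.avro:avro` dependency; the file path in the comment is illustrative):

```java
import java.io.ByteArrayOutputStream;

import org.apache.avro.Schema;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericDatumWriter;
import org.apache.avro.generic.GenericRecord;
import org.apache.avro.io.BinaryEncoder;
import org.apache.avro.io.DatumWriter;
import org.apache.avro.io.EncoderFactory;

public class AvscExample {

    // Inlined for the example; in the real app you would parse the file, e.g.
    //   new Schema.Parser().parse(new File("src/main/resources/Hello.avsc"))
    private static final String SCHEMA_JSON =
            "{\"namespace\": \"com.example.kafka\", \"type\": \"record\", \"name\": \"Hello\","
          + " \"fields\": [{\"name\": \"name\", \"type\": \"string\"},"
          + "              {\"name\": \"age\",  \"type\": \"string\"}]}";

    public static byte[] toAvroBytes(String name, String age) throws Exception {
        Schema schema = new Schema.Parser().parse(SCHEMA_JSON);

        // Fill a GenericRecord matching the schema's fields
        GenericRecord record = new GenericData.Record(schema);
        record.put("name", name);
        record.put("age", age);

        // Encode as raw Avro binary (no schema, no Confluent header)
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        DatumWriter<GenericRecord> writer = new GenericDatumWriter<>(schema);
        BinaryEncoder encoder = EncoderFactory.get().binaryEncoder(out, null);
        writer.write(record, encoder);
        encoder.flush();
        return out.toByteArray();
    }
}
```

A serializer built this way produces the bytes itself; the consumer must parse the same .avsc to decode them.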
Note that messages encoded with Jackson Avro are not compatible with Confluent's, since Confluent-serialized messages do not contain the Avro schema itself, so you cannot mix these (de)serializers.
