Getting an error when trying to connect a Kafka topic to Postgres using the JDBC sink connector

These are the error logs (see the screenshots below) that I get when I try with the following configuration:
{
  "name": "temperature_jdbcsink",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
    "tasks.max": "1",
    "topics": "temperature",
    "key.converter": "io.confluent.connect.avro.AvroConverter",
    "value.converter": "io.confluent.connect.avro.AvroConverter",
    "key.converter.schema.registry.url": "http://localhost:8081",
    "value.converter.schema.registry.url": "http://localhost:8081",
    "transforms": "Flatten,RenameFields",
    "transforms.Flatten.type": "org.apache.kafka.connect.transforms.Flatten$Value",
    "transforms.Flatten.delimiter": "_",
    "transforms.RenameFields.type": "org.apache.kafka.connect.transforms.ReplaceField$Value",
    "transforms.RenameFields.renames": "value:value,timestamp:timestamp",
    "connection.url": "jdbc:postgresql://localhost:5432/jdbcsink",
    "connection.user": "postgres",
    "connection.password": "postgres",
    "insert.mode": "upsert",
    "batch.size": "2",
    "table.name.format": "temperature",
    "pk.mode": "none",
    "db.timezone": "Asia/Kolkata"
  }
}
https://i.stack.imgur.com/fXqO3.png
https://i.stack.imgur.com/V5Btk.png
1 Answer

cwxwcias #1:
The error message is "Unknown magic byte". It looks like the topic contains data that was not produced with the Confluent Avro serializer. For example, is your key really Avro? That would be unusual for data going into a JDBC sink, since database primary keys are typically plain string or integer types. Use the converters that match those types instead of Avro.
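A minimal sketch of that converter change, assuming the record keys were written as plain strings and only the values are genuine Confluent Avro: swap the key converter to StringConverter so the sink no longer expects the Confluent wire format (magic byte + schema id) on the key, and drop the key-side schema-registry setting, which StringConverter does not use.

    "key.converter": "org.apache.kafka.connect.storage.StringConverter",
    "value.converter": "io.confluent.connect.avro.AvroConverter",
    "value.converter.schema.registry.url": "http://localhost:8081",

If the keys are not strings (for example raw bytes, or JSON produced without the schema registry), use org.apache.kafka.connect.converters.ByteArrayConverter or org.apache.kafka.connect.json.JsonConverter instead; and if the same "Unknown magic byte" error appears for the value side, the value converter likewise has to match how the values were actually serialized.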