Kafka Connect: how to extract a field

eyh26e7m · asked 2021-06-04 · Kafka

I'm using the Debezium SQL Server connector to stream a table into a topic. Thanks to the Debezium ExtractNewRecordState SMT, I get the following message in my topic.

```
{
   "schema":{
      "type":"struct",
      "fields":[
         {
            "type":"int64",
            "optional":false,
            "field":"id"
         },
         {
            "type":"string",
            "optional":false,
            "field":"customer_code"
         },
         {
            "type":"string",
            "optional":false,
            "field":"topic_name"
         },
         {
            "type":"string",
            "optional":true,
            "field":"payload_key"
         },
         {
            "type":"boolean",
            "optional":false,
            "field":"is_ordered"
         },
         {
            "type":"string",
            "optional":true,
            "field":"headers"
         },
         {
            "type":"string",
            "optional":false,
            "field":"payload"
         },
         {
            "type":"int64",
            "optional":false,
            "name":"io.debezium.time.Timestamp",
            "version":1,
            "field":"created_on"
         }
      ],
      "optional":false,
      "name":"test_server.dbo.kafka_event.Value"
   },
   "payload":{
      "id":129,
      "customer_code":"DVTPRDFT411",
      "topic_name":"DVTPRDFT411",
      "payload_key":null,
      "is_ordered":false,
      "headers":"{\"kafka_timestamp\":1594566354199}",
      "payload":"MSG 18",
      "created_on":1594595154267
   }
}
```

After adding value.converter.schemas.enable=false, I was able to get rid of the schema part and keep only the payload part, as shown below.

```
{
   "id":130,
   "customer_code":"DVTPRDFT411",
   "topic_name":"DVTPRDFT411",
   "payload_key":null,
   "is_ordered":false,
   "headers":"{\"kafka_timestamp\":1594566354199}",
   "payload":"MSG 19",
   "created_on":1594595154280
}
```

I'd like to go one step further and extract only the customer_code field. I tried ExtractField$Value, but I keep getting the exception IllegalArgumentException: Unknown field: customer_code.
My configuration is as follows:

```
transforms=unwrap,extract
transforms.unwrap.type=io.debezium.transforms.ExtractNewRecordState
transforms.unwrap.drop.tombstones=true
transforms.unwrap.delete.handling.mode=drop
transforms.extract.type=org.apache.kafka.connect.transforms.ExtractField$Key
transforms.extract.field=customer_code
```
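Note that the config above uses ExtractField$Key, which looks the field up in the record key, while customer_code lives in the record value; that mismatch alone could produce the Unknown field error (the updated config further down does use ExtractField$Value). For reference, ExtractField$Value replaces the entire record value with the named field's value, so if the extraction works, each message value should shrink to the bare string. A sketch of the expected output, based on the sample record above:

```
"DVTPRDFT411"
```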

I tried many other SMTs, including ExtractField$Key and ValueToKey, but I couldn't get any of them to work. I'd really appreciate it if you could tell me what I'm doing wrong. According to this tutorial from Confluent it should work, but it doesn't. (The keying pattern from that tutorial is sketched below.)
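For reference, the Confluent tutorial's pattern for keying messages by a value field chains ValueToKey with ExtractField$Key. A minimal sketch of that chain combined with the Debezium unwrap, in case the goal is to set the record key rather than replace the whole value (the transform names makekey and extractkey are illustrative):

```
transforms=unwrap,makekey,extractkey
transforms.unwrap.type=io.debezium.transforms.ExtractNewRecordState
transforms.makekey.type=org.apache.kafka.connect.transforms.ValueToKey
transforms.makekey.fields=customer_code
transforms.extractkey.type=org.apache.kafka.connect.transforms.ExtractField$Key
transforms.extractkey.field=customer_code
```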

UPDATE

I'm running Kafka Connect with connect-standalone, passing it worker.properties and sqlserver.properties.

worker.properties:

```
offset.storage.file.filename=C:/development/kafka_2.12-2.5.0/data/kafka/connect/connect.offsets
plugin.path=C:/development/kafka_2.12-2.5.0/plugins
bootstrap.servers=127.0.0.1:9092
offset.flush.interval.ms=10000
rest.port=10082
rest.host.name=127.0.0.1
rest.advertised.port=10082
rest.advertised.host.name=127.0.0.1

internal.key.converter=org.apache.kafka.connect.json.JsonConverter
internal.value.converter=org.apache.kafka.connect.json.JsonConverter
internal.key.converter.schemas.enable=false
internal.value.converter.schemas.enable=false
key.converter=org.apache.kafka.connect.json.JsonConverter
value.converter=org.apache.kafka.connect.json.JsonConverter
value.converter.schemas.enable=false
```

sqlserver.properties:

```
name=sql-server-connector
connector.class=io.debezium.connector.sqlserver.SqlServerConnector
database.hostname=127.0.0.1
database.port=1433
database.user=sa
database.password=dummypassword
database.dbname=STGCTR
database.history.kafka.bootstrap.servers=127.0.0.1:9092

database.server.name=wfo
table.whitelist=dbo.kafka_event
database.history.kafka.topic=db_schema_history
transforms=unwrap,extract
transforms.unwrap.type=io.debezium.transforms.ExtractNewRecordState
transforms.unwrap.drop.tombstones=true
transforms.unwrap.delete.handling.mode=drop
transforms.extract.type=org.apache.kafka.connect.transforms.ExtractField$Value
transforms.extract.field=customer_code
```
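For what it's worth, converters can also be overridden per connector rather than at the worker level, which keeps the schemaless setting scoped to this connector alone. A sketch, reusing the same JsonConverter as the worker config:

```
value.converter=org.apache.kafka.connect.json.JsonConverter
value.converter.schemas.enable=false
```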

vtwuwzda:

The schema and payload fields sound like your data was serialized with the JsonConverter with schemas enabled.
You just need to set value.converter.schemas.enable=false to achieve your goal.
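With that set, a quick way to verify the result is to consume the topic directly. A sketch, assuming Debezium's default <server>.<schema>.<table> topic naming, which with database.server.name=wfo gives wfo.dbo.kafka_event (on Windows, use the .bat equivalent under bin\windows):

```
bin/kafka-console-consumer.sh --bootstrap-server 127.0.0.1:9092 \
  --topic wfo.dbo.kafka_event --from-beginning
```

With the unwrap and extract transforms applied, each line should print just the bare field value, e.g. "DVTPRDFT411".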
