I am using the Kafka Elasticsearch sink connector to push incoming messages to ES, but I am running into the following problem:
[2018-10-05 13:01:21,388] ERROR WorkerSinkTask{id=elasticsearch.sink.direct-10} Task threw an uncaught and unrecoverable exception (org.apache.kafka.connect.runtime.WorkerTask:172)
org.apache.kafka.connect.errors.DataException: Converting byte[] to Kafka Connect data failed due to serialization error:
Caused by: org.apache.kafka.common.errors.SerializationException: com.fasterxml.jackson.core.JsonParseException: Illegal character ((CTRL-CHAR, code 0)): only regular white space (\r, \n, \t) is allowed between tokens
at [Source: (byte[])" "; line: 1, column: 2]
Caused by: com.fasterxml.jackson.core.JsonParseException: Illegal character ((CTRL-CHAR, code 0)): only regular white space (\r, \n, \t) is allowed between tokens
at [Source: (byte[])" "; line: 1, column: 2]
When I run a console consumer with the print.key property set to true, the incoming key and value messages look like this (the key first, then the value; a sample consumer command is sketched after the messages):
{
  "schema": {
    "type": "struct",
    "fields": [
      {
        "type": "int32",
        "optional": false,
        "field": "MY_SETTING_ID"
      }
    ],
    "optional": false
  },
  "payload": {
    "MY_SETTING_ID": 9
  }
}
{
  "schema": {
    "type": "struct",
    "fields": [
      {
        "type": "int32",
        "optional": false,
        "field": "MY_SETTING_ID"
      },
      {
        "type": "string",
        "optional": true,
        "field": "MY_SETTING_NAME"
      }
    ],
    "optional": false
  },
  "payload": {
    "MY_SETTING_ID": 9,
    "MY_SETTING_NAME": "setting_name"
  }
}
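For reference, this is roughly the kind of console-consumer invocation that produces output like the above (a sketch; the broker address and topic name are taken from the config files below, and older distributions use bin/kafka-console-consumer.sh instead):

# Print both the record key and the value for the connector's source topic.
kafka-console-consumer --bootstrap-server dev-insight-kafka01:9092 \
  --topic stream.app_setting \
  --from-beginning \
  --property print.key=true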
Here, MY_SETTING_ID serves as the key.
I have the following standalone worker properties file:
bootstrap.servers=dev-insight-kafka01:9092,dev-insight-kafka02:9092,dev-insight-kafka03:9092
key.converter=org.apache.kafka.connect.json.JsonConverter
value.converter=org.apache.kafka.connect.json.JsonConverter
key.converter.schemas.enable=false
value.converter.schemas.enable=false
internal.key.converter=org.apache.kafka.connect.json.JsonConverter
internal.value.converter=org.apache.kafka.connect.json.JsonConverter
internal.key.converter.schemas.enable=false
internal.value.converter.schemas.enable=false
offset.storage.file.filename=/apps/dev/logs/offsets/elasticsearch-direct.offsets
offset.flush.interval.ms=120000
rest.port=8099
plugin.path=/usr/share/java
producer.max.request.size = 10485760
consumer.auto.offset.reset=latest
consumer.session.timeout.ms=300000
consumer.request.timeout.ms=310000
flush.timeout.ms=160000
heartbeat.interval.ms= 60000
session.timeout.ms= 200000
and the sink connector properties file (a sketch of how the two files are launched together follows this block):
name=elasticsearch.sink.direct
connector.class=io.confluent.connect.elasticsearch.ElasticsearchSinkConnector
tasks.max=16
topics=stream.app_setting
connection.url=http://dev-elastic-search01:9200
type.name=logs
topic.index.map=stream.app_setting:direct_app_setting_index
batch.size=2048
max.buffered.records=32768
flush.timeout.ms=60000
max.retries=10
retry.backoff.ms=1000
schema.ignore=true
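For context, this is roughly how the two files above are wired together when starting the connector (a sketch; worker.properties and elasticsearch-sink.properties are placeholder names for the standalone and connector files shown above, and the script is bin/connect-standalone.sh in a plain Apache Kafka install):

# Standalone mode takes the worker properties first, then one or more connector properties files.
connect-standalone worker.properties elasticsearch-sink.properties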
I would really appreciate it if someone could look over my properties files and tell me where I am going wrong.
1 Answer
Since you are including the schema as part of the JSON, you should set key.converter.schemas.enable=true and value.converter.schemas.enable=true in the worker configuration.
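A minimal sketch of the corrected converter settings in the standalone worker file, assuming both the keys and the values keep the schema/payload envelope shown in the question:

key.converter=org.apache.kafka.connect.json.JsonConverter
value.converter=org.apache.kafka.connect.json.JsonConverter
# The messages embed a "schema" block, so the JsonConverter must be told to expect it.
key.converter.schemas.enable=true
value.converter.schemas.enable=true

Restart the standalone worker after changing the file so the converters pick up the new setting.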