Confluent Elasticsearch sink: subscribe to all topics with a regex and send each to its index

1tuwyuhd asked on 2021-06-06 in Kafka
Follow (0) | Answers (0) | Views (110)

I have many topics like these:

client1-table1
client1-table2
client1-table3
client1-table4

I want my Elasticsearch sink to listen for any incoming messages and send them to the corresponding index, but my current configuration doesn't work. What can I do? Below is my Elasticsearch sink connector config:

{
  "name": "es-data",
  "config": {
    "_comment": "-- standard converter stuff -- this can actually go in the worker config globally --",
    "connector.class": "io.confluent.connect.elasticsearch.ElasticsearchSinkConnector",
    "value.converter": "io.confluent.connect.avro.AvroConverter",
    "key.converter": "io.confluent.connect.avro.AvroConverter",
    "key.converter.schema.registry.url": "http://localhost:8081",
    "value.converter.schema.registry.url": "http://localhost:8081",

    "_comment": "--- Elasticsearch-specific config ---",
    "_comment": "Elasticsearch server address",
    "connection.url": "http://127.0.0.1:9200",

    "_comment": "If the Kafka message doesn't have a key (as is the case with JDBC source)  you need to specify key.ignore=true. If you don't, you'll get an error from the Connect task: 'ConnectException: Key is used as document id and can not be null.",
    "key.ignore": "true"
  }
}
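The config above never subscribes to any topics: every sink connector needs either `topics` or `topics.regex`. Below is a minimal sketch of a fixed config, assuming the topic names all share the `client1-` prefix (the regex is a hypothetical pattern; adjust it to your actual naming scheme). By default the Elasticsearch sink writes each record to an index named after its source topic, which gives the per-topic routing described in the question:

```json
{
  "name": "es-data",
  "config": {
    "connector.class": "io.confluent.connect.elasticsearch.ElasticsearchSinkConnector",
    "connection.url": "http://127.0.0.1:9200",
    "key.converter": "io.confluent.connect.avro.AvroConverter",
    "key.converter.schema.registry.url": "http://localhost:8081",
    "value.converter": "io.confluent.connect.avro.AvroConverter",
    "value.converter.schema.registry.url": "http://localhost:8081",
    "key.ignore": "true",

    "_comment": "Subscribe to every topic matching this pattern (hypothetical pattern, adjust as needed)",
    "topics.regex": "client1-.*"
  }
}
```

With `topics.regex`, the connector also picks up newly created matching topics automatically. If the index name needs to differ from the topic name, a `RegexRouter` single message transform can rewrite the topic name before it reaches the sink.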
