Consuming a Kafka topic into Elasticsearch with Logstash

ruyhziif  posted on 2021-06-07 in Kafka

I have started consuming messages from Kafka with Logstash and want to send the whole topic to Elasticsearch, but Logstash never receives anything: I can see the messages in Kafka, yet nothing comes through from the Kafka input. What is the correct way to configure this?

input {
  kafka {
    zk_connect => "localhost:2181"
    topic_id => "event"
  }
}

output {
  stdout {
    codec => rubydebug
  }
  elasticsearch {
    index => "event-%{+YYYY.MM.dd}"
    hosts => ["localhost:9201"]
    codec => json
  }
}
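
Before suspecting Kafka, it can help to rule out a configuration error: Logstash can validate a pipeline file without starting it. A minimal sketch, assuming the config above is saved as kafka-es.conf (a hypothetical filename) and a Logstash 2.x install, where the flag is --configtest (newer releases call it --config.test_and_exit):

# check the pipeline syntax only, then exit
bin/logstash --configtest -f kafka-es.conf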

curl localhost:9201
{
  "name" : "Flex",
  "cluster_name" : "elasticsearch",
  "version" : {
    "number" : "2.3.4",
    "build_hash" : "e455fd0c13dceca8dbbdbb1665d068ae55dabe3f",
    "build_timestamp" : "2016-06-30T11:24:31Z",
    "build_snapshot" : false,
    "lucene_version" : "5.5.0"
  },
  "tagline" : "You Know, for Search"
}
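
Since the output writes to event-%{+YYYY.MM.dd}, Elasticsearch itself can tell you whether anything arrived. A quick check against the same node, using only the standard cat and count APIs:

curl 'localhost:9201/_cat/indices/event-*?v'     # list any event-* indices Logstash has created
curl 'localhost:9201/event-*/_count?pretty'      # total number of documents in them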

The command:

/kafka-console-consumer.sh --zookeeper localhost:2181 --topic event

returns results from time to time.
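
If the console consumer only shows new messages as they arrive, it is also worth confirming that the topic still holds older data, since by default the Logstash consumer starts at the latest offset. A small variation on the same command (--from-beginning is a standard flag of the old console consumer):

kafka-console-consumer.sh --zookeeper localhost:2181 --topic event --from-beginning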

Answer 1, by ovfsdjhp:

Try it with auto_offset_reset and reset_beginning:

kafka {
  topic_id => "event"
  zk_connect => "localhost:2181"
  group_id => "event-group"
  auto_offset_reset => "smallest"
  reset_beginning => true
  consumer_threads => 1
}
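
Note that these options belong to the old logstash-input-kafka plugin, the one that connects through ZooKeeper. On a Logstash install that ships the newer plugin (3.x and later, targeting Kafka 0.9+), zk_connect, topic_id and reset_beginning no longer exist, and a roughly equivalent input would look like the sketch below; the broker address and group name are assumptions, not values from the question:

kafka {
  bootstrap_servers => "localhost:9092"   # Kafka broker list, not ZooKeeper (assumed address)
  topics => ["event"]
  group_id => "event-group"
  auto_offset_reset => "earliest"         # newer equivalent of "smallest" + reset_beginning
  consumer_threads => 1
}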
