docker: unable to send data from the logstash container to the kafka container

z4iuyo4d · posted 2021-06-08 in Kafka

I have two Docker containers: one running Logstash and the other running ZooKeeper and Kafka. I am trying to send data from Logstash to Kafka, but nothing ever arrives in my Kafka topic.
I can log into the Kafka container, produce a message to my topic from the terminal, and consume it there as well.
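
For reference, that manual check looked roughly like this (a sketch assuming a standard Kafka tarball install inside the container; script paths and flags vary by Kafka version):

docker exec -it kafka2 /bin/bash
# inside the container; /opt/kafka is an assumed install path
cd /opt/kafka
bin/kafka-console-producer.sh --broker-list localhost:9092 --topic MyTopicName
bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic MyTopicName --from-beginning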
I am using the Kafka output plugin:

output {
    kafka {
        topic_id => "MyTopicName"
        broker_list => "kafkaIPAddress:9092"
    }
}

I got the IP address by running docker inspect kafka2. When I run ./bin/logstash agent --config /etc/logstash/conf.d/01-input.conf I get this error:

Settings: Default pipeline workers: 4
Unknown setting 'broker_list' for kafka {:level=>:error}
Pipeline aborted due to error {:exception=>#<LogStash::ConfigurationError: Something is wrong with your configuration.>, :backtrace=>["/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.3-java/lib/logstash/config/mixin.rb:134:in `config_init'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.3-java/lib/logstash/outputs/base.rb:63:in `initialize'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.3-java/lib/logstash/output_delegator.rb:74:in `register'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.3-java/lib/logstash/pipeline.rb:181:in `start_workers'", "org/jruby/RubyArray.java:1613:in `each'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.3-java/lib/logstash/pipeline.rb:181:in `start_workers'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.3-java/lib/logstash/pipeline.rb:136:in `run'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.3-java/lib/logstash/agent.rb:473:in `start_pipeline'"], :level=>:error}
stopping pipeline {:id=>"main"}

I have already checked the file's configuration by running the command below, which returns OK:

./bin/logstash agent --configtest --config /etc/logstash/conf.d/01-input.conf
Configuration OK

Has anyone come across this before? Could it be that I haven't opened the port on the Kafka container, and if so, how can I open it while keeping Kafka running?


sbdsn5lh1#

The error is right here: broker_list => "kafkaIPAddress:9092". Try bootstrap_servers => "kafkaIPAddress:9092" instead, as in the sketch below. If the containers are on different machines, map Kafka's port 9092 to the host and use the host's address:port; if they are on the same host, the internal Docker IP:port works.
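
A minimal sketch of the corrected output block, keeping the topic and address from the question:

output {
    kafka {
        topic_id => "MyTopicName"
        bootstrap_servers => "kafkaIPAddress:9092"
    }
}

Note that a port mapping cannot be added to a running container; it has to be published when the Kafka container is created, along these lines (the image name here is a placeholder):

docker run -d --name kafka2 -p 9092:9092 some-kafka-image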
