Installing Logstash with Docker
Goal: have Logstash consume data from Kafka and print it to the console via the stdout output.
Versions
Kafka 0.8.2.1
Logstash 2.3.2
Elasticsearch 2.3.3
Docker
Logstash config
input {
  kafka {
    zk_connect => 'remoteZookeeperServer:2181'
    topic_id => 'testTopic'
  }
}
output {
  stdout { codec => rubydebug }
}
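The container can be started along these lines. This is a sketch: the image tag, the mounted host directory, and the config path inside the container are assumptions (the path `/config-dir/logstash-wcs.conf` matches the one that appears in the debug logs below).

```shell
# Run Logstash 2.3 in Docker, mounting the host directory that holds
# the pipeline config shown above (image tag and paths are illustrative).
docker run --rm -it \
  -v "$PWD/config-dir:/config-dir" \
  logstash:2.3 \
  logstash -f /config-dir/logstash-wcs.conf --debug
```

The `--debug` flag produces the verbose output shown in the second log excerpt below.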
Docker logs
{:timestamp=>"2017-03-31T09:13:01.120000+0000", :message=>"Pipeline main started"}
// finished
Docker logs (running Logstash in debug mode)
{:timestamp=>"2017-03-31T08:23:41.987000+0000", :message=>"Reading config file", :config_file=>"/config-dir/logstash-wcs.conf", :level=>:debug, :file=>"logstash/config/loader.rb", :line=>"69", :method=>"local_config"}
{:timestamp=>"2017-03-31T08:23:42.022000+0000", :message=>"Plugin not defined in namespace, checking for plugin file", :type=>"input", :name=>"kafka", :path=>"logstash/inputs/kafka", :level=>:debug, :file=>"logstash/plugin.rb", :line=>"76", :method=>"lookup"}
{:timestamp=>"2017-03-31T08:23:42.363000+0000", :message=>"Plugin not defined in namespace, checking for plugin file", :type=>"codec", :name=>"json", :path=>"logstash/codecs/json", :level=>:debug, :file=>"logstash/plugin.rb", :line=>"76", :method=>"lookup"}
{:timestamp=>"2017-03-31T08:23:42.370000+0000", :message=>"config LogStash::Codecs::JSON/@charset = \"UTF-8\"", :level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"153", :method=>"config_init"}
...
// cannot find warn or error
...
{:timestamp=>"2017-03-31T08:23:42.549000+0000", :message=>"Will start workers for output", :worker_count=>1, :class=>LogStash::Outputs::Stdout, :level=>:debug, :file=>"logstash/output_delegator.rb", :line=>"77", :method=>"register"}
{:timestamp=>"2017-03-31T08:23:42.553000+0000", :message=>"Starting pipeline", :id=>"main", :pipeline_workers=>4, :batch_size=>125, :batch_delay=>5, :max_inflight=>500, :level=>:info, :file=>"logstash/pipeline.rb", :line=>"188", :method=>"start_workers"}
{:timestamp=>"2017-03-31T08:23:42.561000+0000", :message=>"Pipeline main started", :file=>"logstash/agent.rb", :line=>"465", :method=>"start_pipeline"}
{:timestamp=>"2017-03-31T08:23:47.565000+0000", :message=>"Pushing flush onto pipeline", :level=>:debug, :file=>"logstash/pipeline.rb", :line=>"458", :method=>"flush"}
{:timestamp=>"2017-03-31T08:23:52.566000+0000", :message=>"Pushing flush onto pipeline", :level=>:debug, :file=>"logstash/pipeline.rb", :line=>"458", :method=>"flush"}
{:timestamp=>"2017-03-31T08:23:57.568000+0000", :message=>"Pushing flush onto pipeline", :level=>:debug, :file=>"logstash/pipeline.rb", :line=>"458", :method=>"flush"}
{:timestamp=>"2017-03-31T08:24:02.570000+0000", :message=>"Pushing flush onto pipeline", :level=>:debug, :file=>"logstash/pipeline.rb", :line=>"458", :method=>"flush"}
{:timestamp=>"2017-03-31T08:24:07.573000+0000", :message=>"Pushing flush onto pipeline", :level=>:debug, :file=>"logstash/pipeline.rb", :line=>"458", :method=>"flush"}
...
Note
When I tested against the same ZooKeeper server with the kafka_2.9.1-0.8.2.1 console consumer, the consumer did receive data.
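That verification can be reproduced roughly as follows (the ZooKeeper host and topic mirror the Logstash config above; the script path inside the Kafka 0.8 distribution is illustrative):

```shell
# Consume from the same topic with the Kafka 0.8 console consumer
# to confirm that messages are actually flowing through the broker.
bin/kafka-console-consumer.sh \
  --zookeeper remoteZookeeperServer:2181 \
  --topic testTopic \
  --from-beginning
```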
Question
I suspect that Logstash cannot connect to the ZooKeeper server and therefore never consumes any data from Kafka.
What is going wrong?
Why does Logstash produce no log entries about its connection to the ZooKeeper server?