elasticsearch how to view logs in Kibana

dgjrabp2  posted on 2022-11-02  in  ElasticSearch

I am new to ELK. I tried the ELK stack with Spring Boot using net.logstash.logback.appender.LogstashTcpSocketAppender, and I send JSON messages to Logstash. Below is my configuration -

logback-spring.xml

<configuration>
    <include resource="org/springframework/boot/logging/logback/defaults.xml" />
    <springProperty scope="context" name="springAppName" source="spring.application.name" />

    <property name="LOG_FILE" value="./${springAppName}" />

    <property name="CONSOLE_LOG_PATTERN"
        value="%clr(%d{yyyy-MM-dd HH:mm:ss.SSS}){faint} %clr(${LOG_LEVEL_PATTERN:-%5p}) %clr(${PID:- }){magenta} %clr(---){faint} %clr([%15.15t]){faint} %clr(%-40.40logger{39}){cyan} %clr(:){faint} %m%n${LOG_EXCEPTION_CONVERSION_WORD:-%wEx}" />

    <appender name="logstash2"
        class="net.logstash.logback.appender.LogstashTcpSocketAppender">
        <destination>localhost:5000</destination>
        <encoder
            class="net.logstash.logback.encoder.LoggingEventCompositeJsonEncoder">
            <providers>
                <timestamp>
                    <timeZone>UTC</timeZone>
                </timestamp>
                <pattern>
                    <pattern>
                        {
                        "severity": "%level",
                        "service": "${springAppName:-}",
                        "trace": "%X{X-B3-TraceId:-}",
                        "span": "%X{X-B3-SpanId:-}",
                        "parent": "%X{X-B3-ParentSpanId:-}",
                        "exportable":
                        "%X{X-Span-Export:-}",
                        "pid": "${PID:-}",
                        "thread": "%thread",
                        "class": "%logger{40}",
                        "rest": "%message"
                        }
                    </pattern>
                </pattern>
            </providers>
        </encoder>
        <keepAliveDuration>5 minutes</keepAliveDuration>
    </appender>

    <root level="INFO">
        <!-- the ref must match the appender name declared above -->
        <appender-ref ref="logstash2" />
    </root>
</configuration>
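
For completeness: the LogstashTcpSocketAppender and LoggingEventCompositeJsonEncoder classes referenced above come from the logstash-logback-encoder library, so that dependency has to be on the application's classpath. A minimal Maven sketch (the version below is only an example, not taken from the question):

<!-- logstash-logback-encoder supplies the TCP appender and the JSON encoders -->
<dependency>
    <groupId>net.logstash.logback</groupId>
    <artifactId>logstash-logback-encoder</artifactId>
    <version>7.4</version>
</dependency>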

Logstash config

input {
    tcp {
        port => 5000
        host => "localhost"
    }
}

filter {
       # pattern matching logback pattern
       grok {
              match => { "message" => "%{TIMESTAMP_ISO8601:timestamp}\s+%{LOGLEVEL:severity}\s+\[%{DATA:service},%{DATA:trace},%{DATA:span},%{DATA:exportable}\]\s+%{DATA:pid}\s+---\s+\[%{DATA:thread}\]\s+%{DATA:class}\s+:\s+%{GREEDYDATA:rest}" }
       }
}

output {
    elasticsearch { hosts => ["localhost:9200"] }
}

But when I open Kibana to view the messages, I see the entire log line as the message, as shown below -

Can someone help me achieve the following output -

wlp8pajw 1#

Your filter block should look like this:

filter {
       # pattern matching logback pattern
       grok {
              match => { "message" => "%{TIMESTAMP_ISO8601:timestamp}\s+%{LOGLEVEL:severity}\s+\[%{DATA:service},%{DATA:trace},%{DATA:span},%{DATA:exportable}\]\s+%{DATA:pid}\s+---\s+\[%{DATA:thread}\]\s+%{DATA:class}\s+:\s+%{GREEDYDATA:rest}" }
       }
       json {
              source => "message"
       }
}
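
A side note, not part of the original answer: since LogstashTcpSocketAppender already sends each event as a single JSON line, another option is to decode it directly at the input with the json_lines codec, in which case the grok filter is not needed at all. A minimal sketch, assuming the same port as in the question:

input {
    tcp {
        port => 5000
        # each line received from the appender is parsed as one JSON event
        codec => json_lines
    }
}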

I don't understand why you are not using index naming in the output block. If you have multiple indices you will run into problems. Add something like this:

output {
    elasticsearch { 
         hosts => ["localhost:9200"] 
         index => "YOUR_INDEX_NAME-%{+YYYY.MM.dd}"
    }
}
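
With the date suffix, Logstash writes one index per day (for example YOUR_INDEX_NAME-2022.11.02), which keeps indices small and makes retention and cleanup easier. In Kibana you would then create an index pattern such as YOUR_INDEX_NAME-* to see all of those daily indices together.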
