I am reading data from Oracle and sending it to Elasticsearch through Logstash. However, when the data is read from Oracle, the value of each DATE-type column is automatically converted to UTC, so it no longer matches the original data. I know that Elasticsearch manages date values in UTC. The original value is 2020-12-29 00:48:00, but it is automatically changed to 2020-12-28T15:28:00.000Z. As a result, Logstash never sees the original value, and I cannot manipulate it with a filter plugin. Is there a way to keep the original value? Also, is it possible to convert the UTC value back to the format "yyyy-MM-dd'T'HH:mm:ss.SSS+0900"?
My Logstash configuration is:
input {
  jdbc {
    jdbc_validate_connection => true
    jdbc_driver_library => "<driver>"
    jdbc_driver_class => "Java::oracle.jdbc.OracleDriver"
    jdbc_connection_string => "<connection>"
    jdbc_user => "<user>"
    jdbc_password => "<password>"
    jdbc_paging_enabled => true
    tracking_column => "unix_ts_in_secs"
    tracking_column_type => "numeric"
    use_column_value => true
    statement => "SELECT * FROM table"
    schedule => "*/1 * * * *"
    charset => "UTF-8"
    enable_metric => false
    last_run_metadata_path => "path"
  }
}
filter {
  mutate {
    copy => { "id" => "[@metadata][_id]" }
    remove_field => ["id", "@version", "unix_ts_in_secs", "ip_num"]
    convert => {
      "deleted" => "boolean"
    }
  }
}
output {
  stdout {
    codec => rubydebug
  }
  elasticsearch {
    hosts => ["eshost"]
    index => "index"
    document_id => "%{[@metadata][_id]}"
  }
}
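A sketch of one possible approach, assuming the database stores local time in Asia/Seoul (+09:00) — the `jdbc_default_timezone` option tells the jdbc input which timezone the DATE columns are in so the UTC conversion is at least correct, and a ruby filter can re-render `@timestamp` in the +0900 format into a separate field (the field name `local_time` is illustrative, not from the original config):

```
input {
  jdbc {
    # ... existing jdbc options unchanged ...
    # Interpret DATE columns as Asia/Seoul local time before
    # Logstash converts them to UTC internally.
    jdbc_default_timezone => "Asia/Seoul"
  }
}
filter {
  # Keep a copy of the timestamp rendered back in +09:00,
  # formatted as yyyy-MM-dd'T'HH:mm:ss.SSS+0900.
  ruby {
    code => 'event.set("local_time", event.get("@timestamp").time.localtime("+09:00").strftime("%Y-%m-%dT%H:%M:%S.%L+0900"))'
  }
}
```

Note that `@timestamp` itself always remains UTC inside Logstash and Elasticsearch; the extra field only preserves a local-time string representation for filters or display.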