How to configure SysFlow to send JSON events to Splunk

vwkv1x7d  posted on 2023-04-08 in Other

I'm new to SysFlow and I want to send events to Splunk.
The problem is that the events arrive in Splunk with a metadata prefix, so Splunk does not know how to automatically interpret them as JSON.
Here is a raw event:

Mar  9 21:57:26 10.0.0.158 1 2023-03-09T23:57:26+02:00 RHEL-Server /usr/bin/sfprocessor 60594 sysflow - {"version":5,"endts":0,"opflags":["EXEC"],"ret":0,"ts":1678399046064475870,"type":"PE","meta":{"schema":4,"tracename":"."},"node":{"id":"RHEL-Server","ip":"10.0.0.158"},"pproc":{"args":"--switched-root --system --deserialize 30","cmdline":"/usr/lib/systemd/systemd --switched-root --system --deserialize 30","createts":1678395155396715155,"entry":true,"exe":"/usr/lib/systemd/systemd","gid":0,"group":"root","name":"systemd","oid":"d2e72642fc1d9df2","pid":1,"tty":false,"uid":0,"user":"root"},"proc":{"acmdline":["/usr/lib/systemd/systemd --switched-root --system --deserialize 30","/usr/lib/systemd/systemd --switched-root --system --deserialize 30"],"aexe":["/usr/lib/systemd/systemd","/usr/lib/systemd/systemd"],"aname":["systemd","systemd"],"apid":[73232,1],"args":"/usr/bin/dnf makecache --timer","cmdline":"/usr/bin/dnf /usr/bin/dnf makecache --timer","createts":1678399046062638027,"entry":false,"exe":"/usr/bin/dnf","gid":0,"group":"root","name":"dnf","oid":"6a55582e3bbddc8a","pid":73232,"tid":73232,"tty":false,"uid":0,"user":"root"},"policies":[{"id":"Unauthorized installer detected","desc":"Use of package installer detected in container","priority":1},{"id":"Unauthorized installer detected","desc":"Use of package installer detected in container","priority":1}],"tags":["actionable-offense","suspicious-process","mitre:T1072"]}

Here is my configuration (/etc/sysflow/pipelines/pipeline.local.json):

{
  "pipeline":[
    {
     "processor": "sysflowreader",
     "handler": "flattener",
     "in": "sysflow sysflowchan",
     "out": "flat flattenerchan"
    },
    {
     "processor": "policyengine",
     "in": "flat flattenerchan",
     "out": "evt eventchan",
     "policies": "/etc/sysflow/policies/runtimeintegrity",
     "mode": "alert"
    },
    {
     "processor": "exporter",
     "in": "evt eventchan",
     "export": "syslog",
     "format": "json",
     "syslog.proto": "udp",
     "syslog.tag": "sysflow",
     "syslog.host": "splunk-server",
     "syslog.port": "514"
    }
  ]
}

I have read the SysFlow documentation on configuration (here).
I know how to parse the events with SPL; I just need to build an add-on that will do it for me (I need the data in data models). Here is the SPL:

index="sysflow" sourcetype="sysflow:syslog"
| rex field=_raw "^(?:[^ \n]* ){7}(?P<json>.+)"
| spath input=json
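The rex command above skips seven space-separated header tokens and captures the rest of the event as JSON. The same idea can be checked outside Splunk; the sketch below is a minimal Python illustration, assuming `_raw` starts with the RFC 5424 syslog header shown in the sample event (version, timestamp, hostname, app-name, procid, msgid, and the `-` structured-data placeholder — seven tokens) followed by the JSON payload. The sample payload here is a hypothetical, trimmed-down version of the real event.

```python
import json
import re

# Hypothetical _raw: seven space-separated header tokens, then the JSON payload.
raw = ('1 2023-03-09T23:57:26+02:00 RHEL-Server /usr/bin/sfprocessor '
       '60594 sysflow - {"version":5,"type":"PE","proc":{"name":"dnf"}}')

# Same regex as the SPL rex command: consume seven tokens, capture the rest.
m = re.match(r"^(?:[^ \n]* ){7}(?P<json>.+)", raw)

# The captured group is valid JSON and can be parsed directly.
event = json.loads(m.group("json"))
print(event["proc"]["name"])  # -> dnf
```

If the header Splunk stores in `_raw` has a different number of tokens (e.g. the forwarder keeps or strips the outer syslog line), the `{7}` count needs adjusting accordingly.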

js5cn81o  1#

Not the most elegant solution... but it works.
I created a field extraction in props.conf:

EXTRACT-json = ^(?:[^ \n]* ){7}(?P<json>.+)

Then I parsed each field myself with the spath() function in an eval, for example:

EVAL-file_path = spath(json, "file.path")
EVAL-parent_process = spath(json, "pproc.cmdline")
EVAL-process = spath(json, "proc.cmdline")
EVAL-user = spath(json, "proc.user")
EVAL-type = spath(json, "type")
EVAL-file_name = spath(json, "file.name")
EVAL-process_name = spath(json, "proc.name")
EVAL-parent_process_name = spath(json, "pproc.name")
EVAL-parent_process_path = spath(json, "pproc.exe")
EVAL-dest = spath(json, "node.id")
EVAL-signature = spath(json, "policies{}.id")
EVAL-file_acl = spath(json, "file.openflags{}")
EVAL-action = spath(json, "opflags{}")
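An alternative that avoids the per-field spath() calls is to strip the header at index time so Splunk's automatic JSON extraction can take over. The sketch below is an assumption, not a tested configuration: SEDCMD rewrites _raw during parsing, and the sed expression here simply drops everything before the first opening brace, which works only if the header itself never contains a `{`.

[sysflow:syslog]
SEDCMD-strip_header = s/^[^{]*//
KV_MODE = json

With that in place, searches can reference fields like proc.cmdline directly, without the EXTRACT/EVAL layer.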

I am now developing an add-on (CIM-compliant) for Splunkbase.
