Decompose Logstash JSON message into fields

無奈伤痛 2021-02-01 21:46

I have a logfile that stores events with a timestamp and a JSON message. For example:

    timestamp {"foo": 12, "bar": 13}

I would like to decompose the JSON message into fields.

4 Answers
  • 2021-02-01 22:03

    Try the latest Logstash (1.2.1) and use the codec setting to parse JSON events directly.

    input {
        file {
            type => "tweetfile"
            path => ["/home/nikhil/temp/feed/*.txt"]
            codec => "json"    # decode each line as JSON as it is read
        }
    }
    filter {
        # parse the "message" field as JSON and nest the result under "tweet"
        json {
            source => "message"
            target => "tweet"
        }
    }
    output {
        stdout { }
        elasticsearch { embedded => true }    # embedded ES instance (old 1.x option)
    }
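
    If you want to check which fields actually come out of the pipeline, a minimal sketch is to swap the plain stdout for the rubydebug codec, which prints every event with all of its decoded fields (debugging only; this replaces the output block above):

    output {
        # pretty-print each event and all of its fields for inspection
        stdout { codec => rubydebug }
    }

    For a pure-JSON line such as {"foo": 12, "bar": 13}, the printed event should carry foo and bar as fields.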
    
  • 2021-02-01 22:07

    I've done this with the following config:

    filter {
      # split the line into log level, timestamp, and the trailing JSON payload
      grok {
        match => ["message", "\[%{WORD}:%{LOGLEVEL}\] %{TIMESTAMP_ISO8601:tstamp} :: %{GREEDYDATA:msg}"]
      }
      # use the extracted timestamp as the event's @timestamp
      date {
        match => [ "tstamp", "yyyy-MM-dd HH:mm:ss" ]
      }
      # parse the JSON payload into top-level fields
      json {
        source => "msg"
      }
    }
    

    By the way, this is a config for the new version, 1.2.0.

    In version 1.1.13 you need to include a target on the json filter, and the message field is referenced as @message in the grok filter; a sketch of that variant follows.
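
    Based on that description, a hedged sketch of the 1.1.13 variant (untested; the target name "parsed" is an arbitrary choice):

    filter {
      grok {
        # in 1.1.13 the raw line is referenced as @message instead of message
        match => ["@message", "\[%{WORD}:%{LOGLEVEL}\] %{TIMESTAMP_ISO8601:tstamp} :: %{GREEDYDATA:msg}"]
      }
      date {
        match => [ "tstamp", "yyyy-MM-dd HH:mm:ss" ]
      }
      json {
        source => "msg"
        target => "parsed"    # 1.1.13 requires an explicit target; the name is hypothetical
      }
    }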

  • 2021-02-01 22:11

    Your JSON is invalid: {"foo": 12, "bar" 13} is missing a colon after "bar".

    It should be:

    {"foo": 12, "bar": 13}

  • 2021-02-01 22:24

    You can just use plain grok filters (regex-style patterns) and assign each matched value to a named capture for easy organization, filtering, and searching.

    An example:

    ((?<foo_identifier>(\"foo\"))):((?<foo_variable_value>(\d+,)))
    

    Something along those lines.
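
    For instance, embedded in a Logstash grok filter it might look like the sketch below (untested; the pattern is taken verbatim from above, and you may need to allow for whitespace after the colon to match the sample line):

    filter {
      grok {
        # each named capture becomes a field on the event
        match => ["message", "((?<foo_identifier>(\"foo\"))):((?<foo_variable_value>(\d+,)))"]
      }
    }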

    Use the Grok Debugger to help out if you get stuck on the syntax, patterns, or things you think should be matching but aren't.

    Hope that helps a bit.
