What should be the grok pattern for those logs? (ingest pipeline for Filebeat)

Posted by 烂漫一生 on 2019-12-12 18:19:44

Question


I'm new to the Elasticsearch community and I would like your help with something I'm struggling with. My goal is to send a huge quantity of log files to Elasticsearch using Filebeat. To do that, I need to parse the data using ingest nodes with the Grok pattern processor. Without it, my logs are not exploitable, as each line falls into the same "message" field. Unfortunately I have some issues with the grok regex and I can't find the problem, as it's the first time I've worked with this. My logs look like this:

2016-09-01T10:58:41+02:00 INFO (6):     165.225.76.76   entreprise1 email1@gmail.com    POST    /application/controller/action  Mozilla/5.0 (Windows NT 6.1; Trident/7.0; rv:11.0) like Gecko   {"getid":"1"}   86rkt2dqsdze5if1bqldfl1
2016-09-01T10:58:41+02:00 INFO (6):     165.225.76.76   entreprise2 email2@gmail.com    POST    /application/controller/action  Mozilla/5.0 (Windows NT 6.1; Trident/7.0; rv:11.0) like Gecko   {"getid":"2"}   86rkt2rgdgdfgdfgeqldfl1
2016-09-01T10:58:41+02:00 INFO (6):     165.225.76.76   entreprise3 email3@gmail.com    POST    /application/controller/action  Mozilla/5.0 (Windows NT 6.1; Trident/7.0; rv:11.0) like Gecko   {"getid":"2"}

So we have tabs as separators, and these fields: date, ip, company_name, email, method (POST, GET), url, browser, json_request, optional_code.
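Before reaching for Grok, the line structure above can be sanity-checked with a plain-Python sketch (not Grok itself; the sample line is rebuilt with real tab separators and the field names follow the list above):

```python
import re

# One log line, with actual tabs between fields as described above.
line = ("2016-09-01T10:58:41+02:00 INFO (6):\t165.225.76.76\tentreprise1\t"
        "email1@gmail.com\tPOST\t/application/controller/action\t"
        "Mozilla/5.0 (Windows NT 6.1; Trident/7.0; rv:11.0) like Gecko\t"
        '{"getid":"1"}\t86rkt2dqsdze5if1bqldfl1')

fields = re.split(r"\t+", line)
# fields[0] holds both the timestamp and the "INFO (6):" level marker.
timestamp = fields[0].split(" ", 1)[0]
ip, company, email, method, url, browser = fields[1:7]
json_request = fields[7]
code = fields[8] if len(fields) > 8 else None  # trailing code is optional
print(timestamp, company, code)
```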

My ingest pipeline JSON looks like this:

PUT _ingest/pipeline/elastic_log_index

    {
      "description" : "Convert logs txt files",
      "processors" : [
        {
          "grok": {
            "field": "message",
            "patterns": ["%{TIMESTAMP_ISO8601:timestamp} %{IP:ip} %{WORD:company}% {EMAILADDRESS:email} %{URIPROTO:method} %{URIPATH:page} %{WORD:browser} %{WORD:code}"]

          }
        },
        {
          "date" : {
            "field" : "timestamp",
            "formats" : ["yyyy-MM-ddTHH:mm:ss INFO(6):"]
          }
        }
      ],
      "on_failure" : [
        {
          "set" : {
            "field" : "error",
            "value" : " - Error processing message - "
          }
        }
      ]
    }

This does not work.

1) How can I escape character(s)? For example "INFO (6):" at the end of the timestamp.

2) Can I just use spaces between fields in my grok pattern? The separators in the log files are tabs.

3) The code at the end of the lines is not always present in the logs; can this be a problem?

4) Do you have any ideas why this configuration doesn't parse my log documents into Elasticsearch at all?

Thanks a lot for your help, and excuse my level of English, I'm French.


Answer 1:


Your grok pattern doesn't match everything in your log, which is why it doesn't work. For instance, %{WORD} will only match Mozilla, not /5.0. You can create a custom pattern to match the entire browser/version string, like this: (?<browser>%{WORD}(/%{NUMBER})?).
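Grok patterns compile down to regular expressions, so the behaviour of that custom browser pattern can be illustrated with a rough Python-re equivalent (%{WORD} ~ \w+, %{NUMBER} ~ a decimal number; an approximation for illustration only):

```python
import re

# (?<browser>%{WORD}(/%{NUMBER})?) approximated in plain regex syntax.
browser_re = re.compile(r"(?P<browser>\w+(/\d+(\.\d+)?)?)")

ua = "Mozilla/5.0 (Windows NT 6.1; Trident/7.0; rv:11.0) like Gecko"
print(browser_re.match(ua).group("browser"))  # Mozilla/5.0, not just Mozilla
```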

You can skip over INFO (6): by simply matching it with .*; it will then be left out of the output.

As far as the spaces are concerned, match them using the predefined grok pattern %{SPACE}, which also covers tabs.

The code at the end can be made optional by creating a custom pattern, i.e. (?<optional_code>%{WORD}?).
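The effect of such a trailing optional group can be sketched with plain Python re (an illustration, not the grok engine itself): it captures the code when present and yields None when the line ends right after the JSON:

```python
import re

# Tail of a log line: closing brace of the JSON, then an optional code.
tail = re.compile(r"\}\s*(?P<optional_code>\w+)?\s*$")

with_code = tail.search('{"getid":"1"}\t86rkt2dqsdze5if1bqldfl1')
without = tail.search('{"getid":"2"}')
print(with_code.group("optional_code"))  # 86rkt2dqsdze5if1bqldfl1
print(without.group("optional_code"))    # None
```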

Your entire grok pattern then becomes:

%{TIMESTAMP_ISO8601:timestamp}.*%{IP:ip}%{SPACE}%{WORD:company_name}%{SPACE}%{EMAILADDRESS:email}%{SPACE}%{URIPROTO:method}%{SPACE}%{URIPATH:page}%{SPACE}(?<browser>%{WORD}(/%{NUMBER})?)%{SPACE}\(%{GREEDYDATA:content}\).*\{%{GREEDYDATA:json}\}%{SPACE}(?<optional_code>%{WORD}?)
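Plugged back into the ingest pipeline, the grok processor could look like this (a sketch: the custom email patterns from the end of this answer are supplied via pattern_definitions, the pattern's backslashes are doubled for JSON, and the date processor's format is switched to ISO8601 to match the timestamp field):

```
PUT _ingest/pipeline/elastic_log_index
{
  "description" : "Convert logs txt files",
  "processors" : [
    {
      "grok": {
        "field": "message",
        "pattern_definitions": {
          "EMAILLOCALPART": "[a-zA-Z][a-zA-Z0-9_.+-=:]+",
          "EMAILADDRESS": "%{EMAILLOCALPART}@%{HOSTNAME}"
        },
        "patterns": ["%{TIMESTAMP_ISO8601:timestamp}.*%{IP:ip}%{SPACE}%{WORD:company_name}%{SPACE}%{EMAILADDRESS:email}%{SPACE}%{URIPROTO:method}%{SPACE}%{URIPATH:page}%{SPACE}(?<browser>%{WORD}(/%{NUMBER})?)%{SPACE}\\(%{GREEDYDATA:content}\\).*\\{%{GREEDYDATA:json}\\}%{SPACE}(?<optional_code>%{WORD}?)"]
      }
    },
    {
      "date" : {
        "field" : "timestamp",
        "formats" : ["ISO8601"]
      }
    }
  ]
}
```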

It will output:

{
  "timestamp": [
    [
      "2016-09-01T10:58:41+02:00"
    ]
  ],
  "YEAR": [
    [
      "2016"
    ]
  ],
  "MONTHNUM": [
    [
      "09"
    ]
  ],
  "MONTHDAY": [
    [
      "01"
    ]
  ],
  "HOUR": [
    [
      "10",
      "02"
    ]
  ],
  "MINUTE": [
    [
      "58",
      "00"
    ]
  ],
  "SECOND": [
    [
      "41"
    ]
  ],
  "ISO8601_TIMEZONE": [
    [
      "+02:00"
    ]
  ],
  "ip": [
    [
      "165.225.76.76"
    ]
  ],
  "IPV6": [
    [
      null
    ]
  ],
  "IPV4": [
    [
      "165.225.76.76"
    ]
  ],
  "SPACE": [
    [
      "   ",
      " ",
      "    ",
      "    ",
      "  ",
      " ",
      "   "
    ]
  ],
  "company_name": [
    [
      "entreprise1"
    ]
  ],
  "email": [
    [
      "email1@gmail.com"
    ]
  ],
  "EMAILLOCALPART": [
    [
      "email1"
    ]
  ],
  "HOSTNAME": [
    [
      "gmail.com"
    ]
  ],
  "method": [
    [
      "POST"
    ]
  ],
  "page": [
    [
      "/application/controller/action"
    ]
  ],
  "browser": [
    [
      "Mozilla/5.0"
    ]
  ],
  "WORD": [
    [
      "Mozilla",
      "86rkt2dqsdze5if1bqldfl1"
    ]
  ],
  "NUMBER": [
    [
      "5.0"
    ]
  ],
  "BASE10NUM": [
    [
      "5.0"
    ]
  ],
  "content": [
    [
      "Windows NT 6.1; Trident/7.0; rv:11.0"
    ]
  ],
  "json": [
    [
      ""getid":"1""
    ]
  ],
  "optional_code": [
    [
      "86rkt2dqsdze5if1bqldfl1"
    ]
  ]
}

When testing online, please add custom patterns for email, as they are not supported there by default:

EMAILLOCALPART [a-zA-Z][a-zA-Z0-9_.+-=:]+
EMAILADDRESS %{EMAILLOCALPART}@%{HOSTNAME}
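These two custom patterns can be sanity-checked with plain Python re (for illustration; %{HOSTNAME} is replaced here by a simplified, hypothetical domain matcher rather than grok's full definition):

```python
import re

# The two custom patterns above, expanded into plain regex.
EMAILLOCALPART = r"[a-zA-Z][a-zA-Z0-9_.+-=:]+"
HOSTNAME = r"[0-9A-Za-z][0-9A-Za-z-]*(?:\.[0-9A-Za-z][0-9A-Za-z-]*)+"  # simplified stand-in
EMAILADDRESS = EMAILLOCALPART + "@" + HOSTNAME

print(re.fullmatch(EMAILADDRESS, "email1@gmail.com") is not None)  # True
print(re.fullmatch(EMAILADDRESS, "not-an-email") is not None)      # False
```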


Source: https://stackoverflow.com/questions/50816580/what-should-be-the-grok-pattern-for-thoses-logs-ingest-pipeline-for-filebeat
