Add extra value to field before sending to elasticsearch

Submitted by 十年热恋 on 2019-12-11 16:52:43

Question


I'm using logstash, filebeat and grok to send data from logs to my elastisearch instance. This is the grok configuration in the pipe

filter {
    grok {
        match => {
            "message" => "%{SYSLOGTIMESTAMP:messageDate} %{GREEDYDATA:messagge}"
        }
    }
}

This works fine. The issue is that messageDate is in the format Jan 15 11:18:25, with no year entry.
Now, I actually know the year these files were created in, and I was wondering whether it is possible to add that value to the field during processing, that is, somehow turn Jan 15 11:18:25 into 2016 Jan 15 11:18:25 before sending to elasticsearch (obviously without editing the files, which I could do easily, but that would be a temporary fix rather than a definitive solution).

I have tried googling whether this is possible, but no luck...


Answer 1:


Valepu,

The only way to modify the data in a field is to use the ruby filter:

filter {
  ruby {
    code => "#your code here#"
  }
}
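As a sketch of what that code option could look like (assuming the grok filter has already captured the timestamp into a messageDate field, and that the missing year is 2016), the ruby filter can prepend the year using the event get/set API:

```
filter {
  ruby {
    # Prepend the known year to the parsed timestamp.
    # Field name and year are assumptions for illustration.
    code => "event.set('messageDate', '2016 ' + event.get('messageDate'))"
  }
}
```

The resulting string can then be handed to the date filter for parsing.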

For more information, such as how to get and set field values, here is the link:

https://www.elastic.co/guide/en/logstash/current/plugins-filters-ruby.html




Answer 2:


If you have a separate field for date as a string, you can use logstash date plugin:

https://www.elastic.co/guide/en/logstash/current/plugins-filters-date.html

If you don't have it as a separate field (as in this case), use this site to construct your own grok pattern:

http://grokconstructor.appspot.com/do/match

I made this to preprocess the values:

%{YEAR:yearVal} %{MONTH:monthVal} %{NUMBER:dayVal} %{TIME:timeVal} %{GREEDYDATA:message}

Not the most elegant, I guess, but you get the values in different fields. Using these, you can create your own date field and parse it with the date filter, so you get a comparable value, or you can use the fields by themselves. I'm sure there is a better solution; for example, you could make your own grok pattern and use that, but I'm going to leave some exploration for you too. :)
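As a sketch of that idea (the fullDate field name is made up for illustration), the separately captured fields could be reassembled with mutate and then handed to the date filter:

```
filter {
  mutate {
    # Rebuild a single date string from the fields captured by the grok pattern above
    add_field => { "fullDate" => "%{yearVal} %{monthVal} %{dayVal} %{timeVal}" }
  }
  date {
    # Joda-Time pattern; may need adjusting to match your actual day/time formats
    match => [ "fullDate", "yyyy MMM dd HH:mm:ss" ]
  }
}
```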




Answer 3:


By reading the grok documentation thoroughly, I found what Google couldn't find for me, and which I apparently missed the first time I read that page:

https://www.elastic.co/guide/en/logstash/current/plugins-filters-grok.html#plugins-filters-grok-add_field

Using the add_field and remove_field options, I managed to add the year to my date; then I used the date plugin to parse it into a timestamp. My filter configuration now looks like this:

filter {
    grok {
        match => {
            "message" => "%{SYSLOGTIMESTAMP:tMessageDate} %{GREEDYDATA:messagge}"
        }
        # add_field and remove_field are grok options, not part of the match hash
        add_field => { "messageDate" => "2016 %{tMessageDate}" }
        remove_field => ["tMessageDate"]
    }
    date {
        match => [ "messageDate", "YYYY MMM dd HH:mm:ss" ]
    }
}

And it worked fine.



Source: https://stackoverflow.com/questions/48432232/add-extra-value-to-field-before-sending-to-elasticsearch
