Question
I have Nginx as a load balancer which is generating logs without year and second information in the timestamp. One of those logs is:
08-10 09:28 root ERROR Error connecting to CRS REST API : [Errno 111] Connection refused
Error connecting to CRS REST API : [Errno 111] Connection refused
The grok pattern for this is:
(?m)%{MONTHNUM:monthNum}\-%{MONTHDAY:monthDay}\s*%{HOUR:hour}:%{MINUTE:minute}\s*%{WORD}\s*%{LOGLEVEL_CUSTOM:severity}\s*%{GREEDYDATA:messagePayload}
I understand that the year can be filled in by Logstash with the current year, which is fine with me since the logs are not old and are collected daily, but the seconds part is important and I am not sure how to handle it. One additional thing: I am using the date filter to convert it into a timestamp to be stored in Elasticsearch, as follows:
mutate {
  add_field => { "timestamp" => "%{monthDay}%{monthNum} %{hour}:{minute}" }
}
date {
  match => ["timestamp", "ddMM HH:mm", "YYYY-MM-dd HH:mm:ss,SSS", "YYYY-MM-dd HH:mm:ss", "YYYY/MM/dd HH:mm:ss", "dd/MMM/yyyy:HH:mm:ss Z", "ddMMYYYY HH:mm:ss", "ISO8601", "YYYY-MM-dd HH:mm:ss,SSSSSS", "YYYY-MM-dd HH:mm:ss.SSS", "YYYY-MM-dd HH:mm:ss.SSSSSS"]
  locale => "en"
  timezone => "UTC"
  target => "@timestamp"
}
It is NOT parsing correctly, and the timestamp in Elasticsearch does not match; neither the date nor the time is right. Any leads?
Answer 1:
The discussion can be found here: https://discuss.elastic.co/t/nginx-logs-without-year-and-seconds-information/57780
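For reference, here is a minimal sketch of one way this could be handled, assuming the grok fields from the question (monthNum, monthDay, hour, minute) are already extracted; this is only an illustration, not necessarily what the linked discussion recommends. Two things to note: the original add_field appears to be missing a % before {minute}, and since the log carries no seconds, a literal ":00" can be appended so the date filter has a complete time to match; the missing year is then filled in with the current year by the date filter, as noted in the question.

filter {
  mutate {
    # Build a parseable timestamp from the grok fields; note the %{minute}
    # (the original config had {minute} without the leading %).
    # The literal ":00" supplies the seconds that the log line does not contain.
    add_field => { "timestamp" => "%{monthDay}%{monthNum} %{hour}:%{minute}:00" }
  }
  date {
    # "ddMM HH:mm:ss" has no year, so the date filter falls back to the current year.
    match => ["timestamp", "ddMM HH:mm:ss"]
    locale => "en"
    timezone => "UTC"
    target => "@timestamp"
  }
}

With the sample line "08-10 09:28 root ERROR ...", monthNum is 08 and monthDay is 10, so the generated field is "1008 09:28:00" and @timestamp resolves to August 10 of the current year at 09:28:00 UTC.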
Source: https://stackoverflow.com/questions/38891821/nginx-logs-without-year-and-seconds-information