Question
I have set up the ELK stack on my Windows machine with the following:

- Elasticsearch
- Logstash
- Kibana
My logstash.conf:

    input {
      file {
        path => "\bin\MylogFile.log"
        start_position => "beginning"
      }
    }
    output {
      elasticsearch {
        hosts => localhost:9200
      }
    }
MylogFile.log (Apache log):

    127.0.0.1 - frank [10/Oct/2000:13:55:36 -0700] "GET /apache_pb.gif HTTP/1.0" 200 2326 "http://www.example.com/start.html" "Mozilla/4.08 [en] (Win98; I ;Nav)"
When I run logstash.conf it creates the following index in Elasticsearch:

    health status index
    yellow open   logstash-2016.10.06

The index is empty and does not get any data from my log file. Please help; I am very new to the ELK stack.
When I query the index logstash-2016.10.10 using http://localhost:9200/logstash-2016.10.10?pretty=true, I get the following:
"logstash-2016.10.10" : {
"aliases" : { },
"mappings" : {
"_default_" : {
"_all" : {
"enabled" : true,
"omit_norms" : true
},
"dynamic_templates" : [ {
"message_field" : {
"mapping" : {
"index" : "analyzed",
"omit_norms" : true,
"fielddata" : {
"format" : "disabled"
},
"type" : "string"
},
"match" : "message",
"match_mapping_type" : "string"
}
}, {
"string_fields" : {
"mapping" : {
"index" : "analyzed",
"omit_norms" : true,
"fielddata" : {
"format" : "disabled"
},
"type" : "string",
"fields" : {
"raw" : {
"index" : "not_analyzed",
"ignore_above" : 256,
"type" : "string"
}
}
},
Answer 1:
Try adding the following lines to your Logstash config and check whether there are any grok parse failures, which would mean the pattern used in your filter section is incorrect:
    output {
      stdout { codec => json }
      file { path => "C:/POC/output3.txt" }
    }
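As a side note, the sample line in the question is in Apache's "combined" log format. To see what fields a correct parse should yield, here is a quick standalone check; the regex and field names below are my own sketch that approximates what grok's %{COMBINEDAPACHELOG} pattern extracts, not Logstash itself:

```python
import re

# Approximation of the Apache combined log format with named groups,
# loosely mirroring the fields grok's %{COMBINEDAPACHELOG} produces.
COMBINED = re.compile(
    r'(?P<clientip>\S+) \S+ (?P<auth>\S+) '
    r'\[(?P<timestamp>[^\]]+)\] '
    r'"(?P<verb>\S+) (?P<request>\S+) HTTP/(?P<httpversion>\S+)" '
    r'(?P<response>\d{3}) (?P<bytes>\S+) '
    r'"(?P<referrer>[^"]*)" "(?P<agent>[^"]*)"'
)

# The sample line from the question.
line = ('127.0.0.1 - frank [10/Oct/2000:13:55:36 -0700] '
        '"GET /apache_pb.gif HTTP/1.0" 200 2326 '
        '"http://www.example.com/start.html" '
        '"Mozilla/4.08 [en] (Win98; I ;Nav)"')

m = COMBINED.match(line)
print(m.groupdict())
```

If a line like this fails to match, grok will tag the event with _grokparsefailure, which is exactly what the stdout output above lets you spot.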
If you do see grok parse failures, start with a generic expression in the filter section and gradually refine it until it parses your logs correctly.
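For what it's worth, an empty index like the one in the question is often caused by the Windows-style backslash path, which the file input does not accept. A sketch of a config that addresses the usual pitfalls; the absolute C:/ path is a placeholder you would adapt to your setup:

```conf
input {
  file {
    # On Windows the file input needs forward slashes and an
    # absolute path; "C:/ELK/bin/MylogFile.log" is a placeholder.
    path => "C:/ELK/bin/MylogFile.log"
    start_position => "beginning"
    # "beginning" only applies to files Logstash has not seen before;
    # pointing sincedb at NUL makes it re-read the file on every run.
    sincedb_path => "NUL"
  }
}
filter {
  grok {
    # Stock pattern for Apache combined access logs.
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
}
output {
  # hosts must be a quoted string (or array), not a bare word.
  elasticsearch { hosts => ["localhost:9200"] }
  stdout { codec => rubydebug }
}
```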
Source: https://stackoverflow.com/questions/39958918/config-file-not-getting-read-by-logstash