logstash-configuration

logstash json post output

北战南征 submitted on 2019-12-12 04:34:46
Question: I am currently trying to POST JSON from JavaScript to Logstash over a tcp input.

JavaScript POST:

xhr = new XMLHttpRequest();
var url = "http://localhost:5043";
xhr.open("POST", url, true);
xhr.setRequestHeader("Content-type", "application/json");
var data = JSON.stringify({"test": "hello"});
xhr.send(data);

Logstash config file:

input {
  tcp {
    port => 5043
  }
}
filter {
}
output {
  stdout {
    codec => rubydebug
  }
}

Output in console:

{
       "message" => "OPTIONS / HTTP/1.1\r",
      "@version" => "1",
    "@timestamp"
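The "OPTIONS / HTTP/1.1" line in the output is the browser's CORS preflight request arriving as raw bytes: the tcp input does not speak HTTP, so the request line is captured verbatim as the message. A minimal sketch of the usual fix, switching to the http input plugin (port kept from the question; CORS headers may still need handling on the client side or a proxy):

```conf
input {
  http {
    port => 5043
    # parses the HTTP request and, given a JSON Content-Type,
    # decodes the body into event fields instead of raw lines
  }
}
output {
  stdout { codec => rubydebug }
}
```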

How to use regex for config files in this use case?

我只是一个虾纸丫 submitted on 2019-12-12 04:25:54
Question: I am using Logstash, which accepts data from a log file containing different types of logs. I tried this:

filter {
  grok {
    match => { "message" => "%{WORD:tag} %{WORD:message} %{WORD:value}" }
  }
}

But it doesn't work.

Answer 1: I am using the grok filter to check whether the log line is of one format. If the grok filter cannot parse the log line (such as with the json lines), _grokparsefailure will be added to the tags. You can then use this tag to differentiate between the two log types.

filter { grok {
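A sketch of how the truncated answer likely continues, assuming the second log type is JSON: lines that grok could not parse carry the _grokparsefailure tag and can be routed through the json filter instead:

```conf
filter {
  grok {
    match => { "message" => "%{WORD:tag} %{WORD:message} %{WORD:value}" }
  }
  if "_grokparsefailure" in [tags] {
    json { source => "message" }
  }
}
```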

Logstash with elasticsearch input loops results and always gives the error "Failed to parse request body"

爱⌒轻易说出口 submitted on 2019-12-12 04:22:22
Question: I'm getting this error no matter how I tweak my very basic Logstash configuration:

Error: [400] {
  "error": {
    "root_cause": [
      {"type": "illegal_argument_exception", "reason": "Failed to parse request body"}
    ],
    "type": "illegal_argument_exception",
    "reason": "Failed to parse request body",
    "caused_by": {
      "type": "json_parse_exception",
      "reason": "Unrecognized token '**************': was expecting ('true', 'false' or 'null')\n at [Source: org.elasticsearch.transport.netty4.ByteBufStreamInput@6d67a6bb; line: 1,
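The "Unrecognized token" hint suggests the elasticsearch input's query option contains a bare Lucene query string rather than a JSON request body. A minimal sketch (hosts and index are placeholders), with query given as full JSON query DSL:

```conf
input {
  elasticsearch {
    hosts => ["localhost:9200"]                 # placeholder
    index => "my-index"                         # placeholder
    # query must be a JSON body, not a bare query string
    query => '{ "query": { "match_all": {} } }'
  }
}
```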

configure TTL in elastic search with index template

北战南征 submitted on 2019-12-12 04:12:38
Question: I have a requirement to store data in Elasticsearch, coming through Logstash, for only 10 days. Since I don't have too much data, I am taking the approach of setting up a TTL through an index template. Could anybody let me know exactly what I have to do? I can create an index template, and in the template's default .json file I have kept the following:

{
  "_ttl": {
    "enabled": true,
    "default": "10d"
  }
}

But I am not sure where to keep this file and how that file is getting
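For reference, the template file can be shipped by the elasticsearch output itself via its template options, so it does not need to be installed by hand (paths and names below are hypothetical). Note that _ttl was deprecated and removed in Elasticsearch 5.x; on modern versions, time-based indices deleted by a scheduled job (e.g. Curator) are the usual replacement:

```conf
output {
  elasticsearch {
    hosts => ["localhost:9200"]                    # placeholder
    template => "/etc/logstash/ttl_template.json"  # hypothetical path to the .json above
    template_name => "ttl_template"                # hypothetical template name
    template_overwrite => true
  }
}
```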

Configuration with output file and codec not parsed by logstash

杀马特。学长 韩版系。学妹 submitted on 2019-12-12 00:42:52
Question: I'm trying a "simple" Logstash configuration and want to output to a file to check. So I took the conf from https://www.elastic.co/guide/en/logstash/current/plugins-outputs-file.html and put it in my conf:

input {
  file {
    exclude => ['*.gz']
    path => ['/var/log/*.log']
    type => 'system logs'
  }
  syslog {
    port => 5000
  }
}
output {
  elasticsearch {
    hosts => ['elasticsearch']
  }
  file {
    path => "/config/logstash_out.log"
    codec => { line { format => "message: %{message}" } }
  }
  stdout {}
}

but when I
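The likely parse error is the extra pair of braces around the codec value: a codec is assigned directly, with its settings in its own block. A corrected sketch of just the file output:

```conf
output {
  file {
    path => "/config/logstash_out.log"
    # codec => line { ... }, not codec => { line { ... } }
    codec => line { format => "message: %{message}" }
  }
}
```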

Combining multiple events in Logstash

大憨熊 submitted on 2019-12-11 19:46:13
Question: I have a Logstash configuration where I'm reading simple lines from a graphite input (though it might as well just be tcp) and forwarding them to RabbitMQ via AMQP.

input {
  graphite {
    host => localhost
    type => carbon
    port => 22003
  }
}
output {
  rabbitmq {
    codec => json
    host => 'localhost'
    port => 5672
    user => 'guest'
    password => 'guest'
    vhost => '/'
    exchange_type => topic
    key => '%{type}'
    persistent => true
    durable => true
    ssl => false
    verify_ssl => false
    workers => 1
    exchange =
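For completeness, a sketch of how the truncated rabbitmq output might end; the exchange name here is hypothetical, since the original is cut off:

```conf
output {
  rabbitmq {
    codec => json
    host => "localhost"
    exchange => "logstash"     # hypothetical name; truncated in the question
    exchange_type => "topic"
    key => "%{type}"
    durable => true
    persistent => true
  }
}
```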

Converting epoch time to date in logstash using ruby filter

∥☆過路亽.° submitted on 2019-12-11 17:25:55
Question: I have a field named "timestamp" in my configuration. It holds an array of epoch times (milliseconds). I want to use a Ruby filter to convert each epoch time in the array into a date format consumable by Kibana, and store the converted dates in a new field as an array. I am getting syntax errors. Can anyone help me out? I am new to Ruby.

ruby {
  code => {'
    event.get("timestamp").each do |x| {
      event["timestamp1"] = Time.at(x)
    }
  '}
}

Answer 1: I don't know about
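A corrected sketch, assuming the field really holds epoch milliseconds (the target field name timestamp1 is kept from the question): code takes a plain string, not a block in braces; the event get/set API replaces direct hash access; and Time.at expects seconds, so each value is divided by 1000:

```conf
filter {
  ruby {
    code => '
      ts = event.get("timestamp")
      if ts.is_a?(Array)
        # map each epoch-millisecond value to an ISO8601 string
        event.set("timestamp1",
          ts.map { |ms| Time.at(ms.to_f / 1000.0).utc.strftime("%Y-%m-%dT%H:%M:%S.%LZ") })
      end
    '
  }
}
```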

convert string to array based on pattern in logstash

不想你离开。 submitted on 2019-12-11 17:24:28
Question: My original data:

{
  message: {
    data: "["1,2","3,4","5,6"]"
  }
}

Now I want to convert the value of the data field to an array, so it should become:

{
  message: {
    data: ["1,2", "3,4", "5,6"]
  }
}

By using

mutate {
  gsub => ["data", "[\[\]]", ""]
}

I got rid of the square brackets. After this, I tried splitting on commas, but that won't work, since my data contains commas as well. I tried writing a dissect block, but that is not useful either. So how should I go ahead with this?

Answer 1: Have you tried the json filter?
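Following the answer's suggestion: the data field already holds a JSON array serialized as a string, so rather than stripping brackets with gsub, the json filter can parse it in place (assuming data sits under message as shown):

```conf
filter {
  json {
    source => "[message][data]"
    target => "[message][data]"   # parse the string back into the same field
  }
}
```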

Unable to get the parsed value out of multi-line logs in logstash

大兔子大兔子 submitted on 2019-12-11 14:12:29
Question: I am using Logstash to output JSON messages to an API. On simple log lines my grok pattern and configuration work absolutely fine, but I am unable to extract values dynamically during exceptions and stack traces.

Log file:

TID: [-1234] [] [2016-06-07 12:52:59,862] INFO {org.apache.synapse.core.axis2.ProxyService} - Successfully created the Axis2 service for Proxy service : TestServiceHttp {org.apache.synapse.core.axis2.ProxyService}
TID: [-1234] [] [2016-06-07 12:59:04,893] INFO
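Stack-trace lines lack the "TID:" prefix, so a common approach is the multiline codec on the input, folding every line that does not start a new entry into the previous event before grok runs (the path is a placeholder):

```conf
input {
  file {
    path => "/var/log/wso2carbon.log"   # placeholder path
    codec => multiline {
      pattern => "^TID:"
      negate => true
      what => "previous"   # non-matching lines join the preceding TID: line
    }
  }
}
```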

Logstash grok filter config for php monolog multi-line(stacktrace) logs

﹥>﹥吖頭↗ submitted on 2019-12-11 11:57:45
Question:

[2018-02-12 09:15:43] development.WARNING: home page
[2018-02-12 09:15:43] development.INFO: home page
[2018-02-12 10:22:50] development.WARNING: home page
[2018-02-12 10:22:50] development.INFO: home page
[2018-02-12 10:22:50] development.ERROR: Call to undefined function vie() {"exception":"[object](Symfony\\Component\\Debug\\Exception\\FatalThrowableError(code: 0): Call to undefined function vie() at /var/www/html/routes/web.php:16
[stacktrace]
#0 /var/www/html/vendor/laravel/framework/src
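A sketch for these Monolog lines, assuming a file input: a multiline codec folds stack-trace lines into the preceding timestamped event, and a grok pattern then splits out the timestamp, environment, and level (the log path is hypothetical):

```conf
input {
  file {
    path => "/var/www/html/storage/logs/laravel.log"  # hypothetical path
    codec => multiline {
      pattern => "^\[%{TIMESTAMP_ISO8601}\]"
      negate => true
      what => "previous"   # stacktrace lines join the [timestamp] line above
    }
  }
}
filter {
  grok {
    match => { "message" => "\[%{TIMESTAMP_ISO8601:timestamp}\] %{DATA:env}\.%{LOGLEVEL:severity}: %{GREEDYDATA:content}" }
  }
}
```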