logstash-configuration

Logstash Capture http response from http output plugin

Submitted by 爷,独闯天下 on 2019-12-08 11:15:10
Question: I wrote a Logstash program to post messages to a URL. There is no error from Logstash, but I would like to know: is there a way to capture the response from the URL we post to using the http output plugin?

    output {
      stdout { codec => json_lines }
      http {
        url => "Rest URl"
        http_method => "post"
        format => "json"
        headers => { "Authorization" => "%{pass}" }
      }
    }

I went through the documentation but did not find anything related to this. Forgive me if it is a dumb question.

Answer 1: There are no dumb questions…
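The http output fires and forgets, so it cannot hand the response back to the pipeline. A minimal sketch of one alternative, assuming the separate logstash-filter-http plugin (shipped with newer Logstash releases) is available — unlike the output, it stores the HTTP response on the event. The body field and target path here are illustrative assumptions, not from the question:

```conf
filter {
  http {
    url         => "Rest URl"                    # same endpoint as in the question
    verb        => "POST"
    body        => "%{message}"                  # assumption: post the event's message field
    body_format => "json"
    headers     => { "Authorization" => "%{pass}" }
    target_body => "[api][response]"             # the response body lands in this field
  }
}
output {
  stdout { codec => json_lines }                 # events now carry [api][response]
}
```

The trade-off is that the call happens in the filter stage, so a slow endpoint slows the whole pipeline.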

Mask middle 6 digits of credit card number in logstash

Submitted by 南笙酒味 on 2019-12-08 11:14:57
Question: The requirement is to show the first 6 digits and the last 4 digits of a credit card number and mask the remaining digits in Logstash. I applied the mutate/gsub filter, but the replacement string doesn't allow regex. Is there any other way this can be done in Logstash?

    if [message] =~ '\d{16}' {
      mutate {
        gsub => ["message", "\d{6}\d{4}\d{4}", "\d{6}######\d{4}"]
        add_tag => "Masked CardNo"
      }
    }

This code masks the credit card number 3456902345871092 as \d{6}######\d{4}, but it should be masked as 345690######1092. As an…
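The replacement string is indeed not a regex, but it does support backreferences to capture groups from the match. Capturing the digits to keep and writing them back as \1 and \2 gives the desired masking — a sketch:

```conf
filter {
  if [message] =~ /\d{16}/ {
    mutate {
      # Keep the first 6 and last 4 digits, mask the middle 6:
      # 3456902345871092 -> 345690######1092
      gsub => ["message", "(\d{6})\d{6}(\d{4})", "\1######\2"]
      add_tag => "Masked CardNo"
    }
  }
}
```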

Combining log entries with logstash

Submitted by 北城以北 on 2019-12-08 10:04:23
Question: I want to collect and process logs from dnsmasq, and I've decided to use ELK. Dnsmasq is used as a DHCP server and as a DNS resolver, and hence it creates log entries for both services. My goal is to send to Elasticsearch all DNS queries with the requester IP, the requester hostname (if available), and the requester MAC address. That will allow me to group the requests per MAC address regardless of whether the device's IP changed, and to display the hostname. What I would like to do is the following: 1) Read…
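The two line types (DNS queries and DHCP leases) can be parsed with one grok filter carrying a pattern for each; a sketch, assuming the default dnsmasq log layout, which varies somewhat by version:

```conf
filter {
  grok {
    match => {
      "message" => [
        # "dnsmasq[123]: query[A] www.example.com from 192.168.1.10"
        "dnsmasq\[%{POSINT:pid}\]: query\[%{WORD:query_type}\] %{HOSTNAME:query_name} from %{IP:client_ip}",
        # "dnsmasq-dhcp[123]: DHCPACK(eth0) 192.168.1.10 aa:bb:cc:dd:ee:ff hostname"
        "dnsmasq-dhcp\[%{POSINT:pid}\]: DHCPACK\(%{WORD:iface}\) %{IP:client_ip} %{MAC:client_mac}( %{HOSTNAME:client_name})?"
      ]
    }
  }
  # The two event types could then be correlated on client_ip (for example with
  # the aggregate filter) to attach the MAC address and hostname to each query.
}
```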

Logstash expected one of #

Submitted by 依然范特西╮ on 2019-12-07 12:23:29
Question: I'm currently trying to run Logstash with the following config file:

    input { stdin { } }
    output {
      rabbitmq {
        exchange => "test_exchange"
        exchange_type => "fanout"
        host => "172.17.x.x"
      }
    }

However, I get an error. Running logstash agent --configtest -f -config.conf gives me:

    Error: Expected one of #, } at line 1, column 105 (byte 105) after output { rabbitmq { exchange => test_exchange exchange_type => fanout host => 172.17

It seems that Logstash has a problem when I put an IP-like address in the…

logstash 5.0.1: setup elasticsearch multiple indexes output for multiple kafka input topics

Submitted by 瘦欲@ on 2019-12-06 11:43:39
Question: I have a Logstash input set up as:

    input {
      kafka {
        bootstrap_servers => "zookeper_address"
        topics => ["topic1","topic2"]
      }
    }

I need to feed the topics into two different indexes in Elasticsearch. Can anyone help me with how the output should be set up for such a task? At this time I am only able to set up:

    output {
      elasticsearch {
        hosts => ["localhost:9200"]
        index => "my_index"
        codec => "json"
        document_id => "%{id}"
      }
    }

I need two indexes on the same Elasticsearch instance, say index1 and index2…

Logstash expected one of #

Submitted by 泪湿孤枕 on 2019-12-06 03:46:19
I'm currently trying to run Logstash with the following config file:

    input { stdin { } }
    output {
      rabbitmq {
        exchange => "test_exchange"
        exchange_type => "fanout"
        host => "172.17.x.x"
      }
    }

However, I get an error. Running logstash agent --configtest -f -config.conf gives me:

    Error: Expected one of #, } at line 1, column 105 (byte 105) after output { rabbitmq { exchange => test_exchange exchange_type => fanout host => 172.17

It seems that Logstash has a problem when I put an IP-like address in the host field. What is wrong with my config?

The whole problem was in the method you used when you created the…
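Note that the error message echoes the string values without their double quotes (exchange => test_exchange, host => 172.17), which suggests the quotes never made it into the file on disk — for example because the config was written via shell echo without escaping, or pasted with smart quotes. A file saved with plain ASCII double quotes parses cleanly:

```conf
input { stdin { } }
output {
  rabbitmq {
    exchange      => "test_exchange"   # plain ASCII double quotes
    exchange_type => "fanout"
    host          => "172.17.x.x"      # unquoted, 172.17 parses as a bare number and fails
  }
}
```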

How to authenticate Logstash output to a secure Elasticsearch URL (version 5.6.5)

Submitted by 坚强是说给别人听的谎言 on 2019-12-05 22:23:22
I am using Logstash and Elasticsearch version 5.6.5. So far I have used the elasticsearch output with the HTTP protocol and no authentication. Now Elasticsearch is being secured using basic authentication (user/password) and a CA-certified HTTPS URL. I don't have any control over the Elasticsearch server; I just use it as an output from Logstash. Now, when I try to configure the HTTPS URL of Elasticsearch with basic authentication, it fails to create the pipeline.

Output configuration:

    output {
      elasticsearch {
        hosts => ["https://myeslasticsearch.server.io"]
        user => "esusername"
        password => "espassword"
        ssl =>…
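In Logstash 5.6 the elasticsearch output supports HTTPS with basic authentication via the ssl, cacert, user, and password options. One frequent pitfall: a host URL without an explicit port defaults to 9200 even for https:// URLs, so a server listening on 443 needs the port spelled out. A sketch, with a hypothetical CA certificate path:

```conf
output {
  elasticsearch {
    # Assumption: the server listens on 443; without an explicit port the
    # output would try port 9200 even for an https:// URL.
    hosts    => ["https://myeslasticsearch.server.io:443"]
    user     => "esusername"
    password => "espassword"
    ssl      => true
    cacert   => "/etc/logstash/ca.pem"   # hypothetical path to the CA certificate
  }
}
```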

Retrieving RESTful GET parameters in logstash

Submitted by 南笙酒味 on 2019-12-05 13:37:57
I am trying to get Logstash to parse key-value pairs in an HTTP GET request from my ELB log files. The request field looks like http://aaa.bbb/get?a=1&b=2. I'd like there to be a field for a and b in the log line above, and I am having trouble figuring it out. My Logstash conf (formatted for clarity) is below; it does not load any additional key fields. I assume that I need to split off the address portion of the URI, but I have not figured that out.

    input {
      file {
        path => "/home/ubuntu/logs/**/*.log"
        type => "elb"
        start_position => "beginning"
        sincedb_path => "log_sincedb"
      }
    }
    filter {
      if…
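Once the request URL sits in its own field, the query string can be split off with a grok pattern and handed to the kv filter. A sketch, assuming an earlier ELB grok stage has already produced a "request" field like http://aaa.bbb/get?a=1&b=2:

```conf
filter {
  grok {
    # Peel the query string off the URL into its own field.
    match => { "request" => "%{URIPROTO}://%{URIHOST}%{URIPATH}(?:\?%{NOTSPACE:query_string})?" }
  }
  kv {
    source      => "query_string"   # a=1&b=2 becomes fields a and b
    field_split => "&"
    value_split => "="
  }
}
```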

logstash 5.0.1: setup elasticsearch multiple indexes output for multiple kafka input topics

Submitted by 帅比萌擦擦* on 2019-12-04 18:58:04
I have a Logstash input set up as:

    input {
      kafka {
        bootstrap_servers => "zookeper_address"
        topics => ["topic1","topic2"]
      }
    }

I need to feed the topics into two different indexes in Elasticsearch. Can anyone help me with how the output should be set up for such a task? At this time I am only able to set up:

    output {
      elasticsearch {
        hosts => ["localhost:9200"]
        index => "my_index"
        codec => "json"
        document_id => "%{id}"
      }
    }

I need two indexes on the same Elasticsearch instance, say index1 and index2, which will be fed by messages coming in on topic1 and topic2.

First, you need to add decorate_events to…
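With decorate_events enabled, the kafka input records the source topic under [@metadata][kafka][topic], which the output section can branch on. A sketch, assuming the Logstash 5.x kafka input plugin:

```conf
input {
  kafka {
    bootstrap_servers => "zookeper_address"
    topics            => ["topic1","topic2"]
    decorate_events   => true    # adds [@metadata][kafka][topic] and friends
  }
}
output {
  if [@metadata][kafka][topic] == "topic1" {
    elasticsearch { hosts => ["localhost:9200"] index => "index1" codec => "json" document_id => "%{id}" }
  } else {
    elasticsearch { hosts => ["localhost:9200"] index => "index2" codec => "json" document_id => "%{id}" }
  }
}
```

Metadata fields are not indexed into Elasticsearch, so the routing information costs nothing in the stored documents.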

Environment Variable replacement in Logstash when running as a service in Ubuntu

Submitted by 久未见 on 2019-12-04 06:14:19
Question: I understand that Logstash now supports environment variables in config, as described here. But I can't seem to get it working when running Logstash as a service. I am on Ubuntu 14.04 with logstash 1:2.3.3-1, and I launch Logstash with sudo service logstash start. A final twist is that I am including Logstash in a Docker container, and I do NOT want to hardcode the variable value in my Dockerfile; I want it to ultimately be sourced from the command line when I launch the container, e.g. docker…
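On Ubuntu 14.04 the init script sources /etc/default/logstash, so both the variable and the flag can be set there; in Logstash 2.3, ${VAR} substitution was still experimental and only enabled behind the --allow-env flag. A sketch, with MY_INDEX as a hypothetical variable name (e.g. forwarded into the container via docker run -e MY_INDEX=foo and written into /etc/default/logstash by an entrypoint script):

```conf
# /etc/default/logstash  (sourced by the service init script)
#   LS_OPTS="--allow-env"     # Logstash 2.3 only honors ${VAR} behind this flag
#   export MY_INDEX           # hypothetical variable, set at container start

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "${MY_INDEX}"    # substituted from the environment at startup
  }
}
```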