Question
I have two URLs (due to security concerns I will explain using dummy ones):
a> https://xyz.company.com/ui/api/token
b> https://xyz.company.com/request/transaction?date=2016-01-21&token=<tokeninfo>
When you hit the URL in point 'a', it generates a token, say a string of 16 characters.
That token should then be passed in the token param of the second request, the one in point 'b'.
Updated
The second URL's response is what matters to me: it is a JSON response, and I need to filter the JSON data, extract the required fields, and output them to standard output and to Elasticsearch.
Is there any way of doing so in Logstash using the "http_poller" plugin or any other plugins?
Note: these request URLs should be executed one after another, i.e. the point 'a' URL should be executed first and the point 'b' URL should be executed next, after receiving the new token.
Please suggest.
Answer 1:
Yes, it's possible with a mix of an http_poller input and an http output.
Here is the config I came up with:
input {
  # 1. trigger new token requests every hour
  http_poller {
    urls => {
      token => "https://xyz.company.com/ui/api/token"
    }
    interval => 3600
    add_field => {"token" => "%{message}"}
  }
}
filter {
}
output {
  # 2. call the API
  http {
    http_method => "get"
    url => "https://xyz.company.com/request/transaction?date=2016-01-21&token=%{token}"
  }
}
UPDATE
If you want to be able to get the content of the API call and store it in ES, you need a hybrid solution. You need to set up a cron job that calls a script which runs the two HTTP calls and stores the results in a file, and then you can let Logstash tail that file and forward the results to ES.
Shell script to put on cron:
#!/bin/sh
# 1. Get the token
TOKEN=$(curl -s -XGET https://xyz.company.com/ui/api/token)
# 2. Call the API with the token and append JSON to file
curl -s -XGET "https://xyz.company.com/request/transaction?date=2016-01-21&token=$TOKEN" >> api_calls.log
The above script can be scheduled using crontab (or a similar tool); there are plenty of examples out there on how to achieve this.
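For instance, assuming the script above is saved as /path/to/fetch_transactions.sh (a placeholder name and location), a crontab entry that runs it at the top of every hour could look like:
# run the token + API fetch every hour, on the hour
0 * * * * /path/to/fetch_transactions.sh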
Then the Logstash config can be very simple. It just needs to tail the api_calls.log file and send the documents to ES:
input {
  # tail the file produced by the cron script
  # (note: the file input expects an absolute path in practice)
  file {
    path => "api_calls.log"
    start_position => "beginning"
  }
}
filter {
  # parse each appended line as JSON
  json {
    source => "message"
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "my_index"
    document_type => "my_type"
  }
  stdout {
    codec => "rubydebug"
  }
}
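To try it out, point Logstash at the config file. Assuming it is saved as api_calls.conf (an illustrative name), it can be started from the Logstash home directory with:
bin/logstash -f api_calls.conf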
Source: https://stackoverflow.com/questions/37436376/logstash-http-poller-first-url-requests-response-should-be-input-to-second-url