I am new to Logstash. I have some logs stored in AWS S3 and I am able to import them into Logstash. My question is: is it possible to use the grok filter to add tags based on the filenames or paths of the log files?
With Logstash 6.0.1, I was able to get the key of each file from S3. In your case, you can use this key (or path) in a filter to add tags.
Example:
input {
  s3 {
    bucket => "<bucket-name>"
    prefix => "<prefix>"
  }
}
filter {
  mutate {
    add_field => {
      "file" => "%{[@metadata][s3][key]}"
    }
  }
  ...
}
Use this file field in subsequent filters to add tags, as shown below.
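For example, here is a minimal sketch that tags events whose S3 key contains "error"; the "error" pattern and the "error-log" tag name are illustrative assumptions, not part of the original answer:

filter {
  mutate {
    add_field => {
      "file" => "%{[@metadata][s3][key]}"
    }
  }
  # Hypothetical rule: tag events whose key mentions "error"
  if [file] =~ /error/ {
    mutate { add_tag => ["error-log"] }
  }
}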
Reference:
Look for eye8's answer in this issue.
There is no "path" in S3 inputs. I mount the S3 storage on my server and use the file inputs. With file inputs, I can use the filter to match the path now.
If you want to add tags based on the filename, I think this will work (I have not tested it):
filter {
  grok {
    # Capture the full path into the "content" field
    match => { "path" => "%{GREEDYDATA:content}" }
  }
  mutate {
    # sprintf expands %{content}, so the tag is the captured value,
    # not the literal string "content"
    add_tag => ["%{content}"]
  }
}
"content" tag will be the filename, now it's up to you to modify the pattern to create differents tags with the specific part of the filename.