Logstash: how to use a filter to match the filename when using S3

Asked by 鱼传尺愫 on 2021-01-14 21:44

I am new to Logstash. I have some logs stored in AWS S3 and I am able to import them into Logstash. My question is: is it possible to use the grok filter to add tags based on the filename?

3 Answers
  • 2021-01-14 22:17

    With Logstash 6.0.1, I was able to get the key of each file from S3. In your case, you can use this key (the object path) in a filter to add tags.

    Example:

    input {
        s3 {
            bucket => "<bucket-name>"
            prefix => "<prefix>"
        }
    }

    filter {
        mutate {
            # The s3 input exposes the object key in event metadata;
            # copy it into a regular field so later filters can use it.
            add_field => {
                "file" => "%{[@metadata][s3][key]}"
            }
        }
        ...
    }
    

    Use this file field in later filter stages to add your tags, as in the sketch below.
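
    A minimal sketch of that last step (the access/error filename patterns and the tag names are hypothetical, not from the original answer):

    filter {
        mutate {
            add_field => { "file" => "%{[@metadata][s3][key]}" }
        }
        # Tag each event according to the S3 object key it came from.
        if [file] =~ /access/ {
            mutate { add_tag => ["access_log"] }
        } else if [file] =~ /error/ {
            mutate { add_tag => ["error_log"] }
        }
    }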

    Reference:

    Look for eye8's answer in this issue.

  • 2021-01-14 22:20

    There is no "path" field in the S3 input. I mount the S3 bucket on my server and use the file input instead. With the file input, I can match on the path in my filters; a sketch of that setup follows.
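
    A minimal sketch of that workaround, assuming the bucket is mounted at /mnt/s3 (for example via s3fs; the mount point and glob are hypothetical):

    input {
        file {
            # Read logs from the locally mounted S3 bucket.
            path => "/mnt/s3/**/*.log"
        }
    }

    filter {
        grok {
            # "path" holds the local filename, so it can drive dynamic tags.
            match => { "path" => "/mnt/s3/%{GREEDYDATA:filename}" }
            add_tag => ["%{filename}"]
        }
    }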

  • 2021-01-14 22:28

    If you want to add tags based on the filename, I think something like this will work (I have not tested it):

    filter {
      grok {
        # Capture the whole path into the "content" field.
        match => { "path" => "%{GREEDYDATA:content}" }
      }
      mutate {
        # Use the captured value (not the literal string) as the tag.
        add_tag => ["%{content}"]
      }
    }
    

    "content" tag will be the filename, now it's up to you to modify the pattern to create differents tags with the specific part of the filename.
