Sending JSON format logs to Kibana using Filebeat, Logstash and Elasticsearch?

既然无缘 2021-02-09 10:26

I have logs like this:

{\"logId\":\"57aaf6c8d32fb\",\"clientIp\":\"127.0.0.1\",\"time\":\"03:11:29 pm\",\"uniqueSubId\":\"57aaf6c98963b\",\"channelName\":\"JSPC\         


        
3 Answers
  •  长发绾君心
    2021-02-09 11:02

    To parse JSON log lines in Logstash that were sent from Filebeat, you need to use a json filter instead of a codec. This is because Filebeat sends its data as JSON, and the contents of your log line are contained in the message field.
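
    For reference, an event shipped by Filebeat shows up in Logstash looking roughly like this (a simplified sketch; the exact metadata fields depend on the Filebeat version), with the original JSON line sitting as a plain string in the message field:

    {
           "message" => "{\"logId\":\"57aaf6c8d32fb\",\"clientIp\":\"127.0.0.1\", ... }",
        "@timestamp" => "2016-08-10T15:11:29.000Z",
              "tags" => ["json"],
              "beat" => { "hostname" => "my-host", "name" => "my-host" }
    }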

    Logstash config:

    input {
      beats {
        port => 5044
      }
    }

    filter {
      # only parse events that Filebeat tagged as JSON (see the Filebeat config below)
      if "json" in [tags] {
        json {
          source => "message"
        }
      }
    }

    output {
      # print each event to the console for debugging
      stdout { codec => rubydebug { metadata => true } }
    }
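
    The stdout output above only prints events to the console for debugging. To actually get the parsed documents into Elasticsearch so Kibana can read them, an elasticsearch output can be added as well (a minimal sketch assuming Elasticsearch runs locally on the default port 9200; adjust hosts and index to your setup):

    output {
      # ship the parsed events to a local Elasticsearch instance
      elasticsearch {
        hosts => ["localhost:9200"]
        index => "filebeat-%{+YYYY.MM.dd}"
      }
    }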
    

    Filebeat config:

    filebeat:
      prospectors:
        - paths:
            - my_json.log
          # place the custom fields at the top level of the event instead of under "fields"
          fields_under_root: true
          fields:
            tags: ['json']
    output:
      logstash:
        hosts: ['localhost:5044']
    

    In the Filebeat config, I added a "json" tag to the event so that the json filter can be conditionally applied to the data.

    Filebeat 5.0 is able to parse the JSON without the use of Logstash, but it is still an alpha release at the moment. This blog post titled Structured logging with Filebeat demonstrates how to parse JSON with Filebeat 5.0.
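
    For reference, a minimal sketch of how that can look in Filebeat 5.x (option names per the 5.x prospector configuration; the log path and hosts are just placeholders):

    filebeat.prospectors:
      - paths:
          - my_json.log
        # decode each line as JSON and merge its keys into the top level of the event
        json.keys_under_root: true
        # record decoding problems in an error field instead of silently dropping the event
        json.add_error_key: true

    output.elasticsearch:
      hosts: ["localhost:9200"]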
