Sending JSON-format logs to Kibana using Filebeat, Logstash and Elasticsearch?

既然无缘 2021-02-09 10:26

I have logs like this:

{"logId":"57aaf6c8d32fb","clientIp":"127.0.0.1","time":"03:11:29 pm","uniqueSubId":"57aaf6c98963b","channelName":"JSPC


        
3 answers
  • 2021-02-09 10:36

    I've scoured the internet for the exact same problem you are having and tried various suggestions, including those above. However, none helped, so I did it the old-fashioned way: I went to the Elasticsearch documentation on Filebeat configuration,

    and the following was all that was required (no need for a filter config in Logstash):

    Filebeat config:

    filebeat.prospectors:
    - input_type: log
      document_type: #whatever your type is, this is optional
      json.keys_under_root: true
      paths:
        - #your path goes here
    

    keys_under_root

    copies the nested JSON keys to the top level of the output document.

    My filebeat version is 5.2.2.
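
    The effect of keys_under_root can be sketched in Python. This is only an illustration of the two output shapes, not Filebeat's actual code; the field names come from the sample log line and the @timestamp value is made up:

    ```python
    import json

    log_line = '{"logId": "57aaf6c8d32fb", "clientIp": "127.0.0.1", "time": "03:11:29 pm"}'
    parsed = json.loads(log_line)

    # keys_under_root: false -> Filebeat nests the parsed fields under a "json" key
    nested_event = {"@timestamp": "2021-02-09T10:26:00Z", "json": parsed}

    # keys_under_root: true -> the parsed fields are copied to the top level
    flat_event = {"@timestamp": "2021-02-09T10:26:00Z", **parsed}

    print(flat_event["logId"])  # 57aaf6c8d32fb
    ```

    With the flat shape you can query and visualize fields like logId directly in Kibana instead of addressing them as json.logId.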

  • 2021-02-09 10:55

    From Filebeat 5.x on, you can do it without using Logstash.

    Filebeat config:

    filebeat.prospectors:
    - input_type: log
      paths: ["YOUR_LOG_FILE_DIR/*"]
      json.message_key: logId
      json.keys_under_root: true
    
    output.elasticsearch:
      hosts: ["<HOSTNAME:PORT>"]
      template.name: filebeat
      template.path: filebeat.template.json
    

    Filebeat is more lightweight than Logstash. Also, even if you need to index into Elasticsearch 2.x, you can still use this feature of Filebeat 5.x. A real example can be found here.

  • 2021-02-09 11:02

    To parse JSON log lines in Logstash that were sent from Filebeat you need to use a json filter instead of a codec. This is because Filebeat sends its data as JSON and the contents of your log line are contained in the message field.

    Logstash config:

    input {
      beats {
        port => 5044
      }
    }

    filter {
      if "json" in [tags] {
        json {
          source => "message"
        }
      }
    }

    output {
      stdout { codec => rubydebug { metadata => true } }
    }
    

    Filebeat config:

    filebeat:
      prospectors:
        - paths:
            - my_json.log
          fields_under_root: true
          fields:
            tags: ['json']
    output:
      logstash:
        hosts: ['localhost:5044']
    

    In the Filebeat config, I added a "json" tag to the event so that the json filter can be conditionally applied to the data.
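
    What the json filter and the tag conditional do can be sketched in Python. This is a rough simulation, not Logstash's actual implementation; the helper name and the event shape are illustrative:

    ```python
    import json

    def apply_json_filter(event, source="message"):
        # Rough equivalent of Logstash's json filter: parse the JSON string
        # held in `source` and merge the resulting fields into the event.
        try:
            event.update(json.loads(event[source]))
        except (KeyError, ValueError):
            # Logstash tags the event on a parse failure rather than dropping it.
            event.setdefault("tags", []).append("_jsonparsefailure")
        return event

    # A Beats event as Logstash receives it: the raw log line sits in "message".
    event = {
        "message": '{"logId": "57aaf6c8d32fb", "clientIp": "127.0.0.1"}',
        "tags": ["json"],
    }

    # Mirrors the tag conditional in the Logstash filter block above.
    if "json" in event.get("tags", []):
        event = apply_json_filter(event)

    print(event["clientIp"])  # 127.0.0.1
    ```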

    Filebeat 5.0 is able to parse the JSON without the use of Logstash, but it is still an alpha release at the moment. This blog post titled Structured logging with Filebeat demonstrates how to parse JSON with Filebeat 5.0.
