Continuously extract new Elasticsearch updates into Kafka using Logstash
Question: I have an ES cluster with multiple indices that all receive updates at random intervals. I have a Logstash instance extracting data from ES and passing it into Kafka. What would be a good way to run this every minute and pick up any new updates in ES?

Current configuration:

    input {
      elasticsearch {
        hosts  => [ "hostname1.com:5432", "hostname2.com" ]
        index  => "myindex-*"
        query  => "*"
        size   => 10000
        scroll => "5m"
      }
    }
    output {
      kafka {
        bootstrap_servers => "abc-kafka.com:1234"
        topic_id          => "my.topic.test"
      }
    }
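One approach I'm considering is an untested sketch along these lines: the elasticsearch input plugin has a cron-style schedule option, so the pull could run once a minute with a range filter on a timestamp field. This assumes every document carries a field such as @timestamp that reflects when it was written; that field name is only an illustration and would need to match whatever the indices actually use:

    input {
      elasticsearch {
        hosts    => [ "hostname1.com:5432", "hostname2.com" ]
        index    => "myindex-*"
        # Run the query once per minute (cron syntax).
        schedule => "* * * * *"
        # Only fetch documents written in the last minute;
        # assumes an @timestamp date field exists on every document.
        query    => '{ "query": { "range": { "@timestamp": { "gte": "now-1m" } } } }'
        size     => 10000
        scroll   => "5m"
      }
    }
    output {
      kafka {
        bootstrap_servers => "abc-kafka.com:1234"
        topic_id          => "my.topic.test"
      }
    }

A fixed now-1m window could still miss or duplicate documents if a run is delayed, so I suspect some overlap plus downstream deduplication (for example, keying the Kafka messages on the ES document id) would be needed. Is there a better way to track "only new" documents here?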