I'm a complete newbie to the ELK stack, so please excuse my ignorance. I've been able to get Logstash to send data from my database to Elasticsearch, but it exits once it's done.
You need to specify a schedule in your jdbc input. The schedule below (* * * * *) runs the query every minute and selects only the records that have been updated since the last time the query ran. Your updated timestamp field might be named differently; feel free to adjust it to fit your case.
input {
  jdbc {
    jdbc_driver_library => "mysql-connector-java-5.1.36-bin.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://localhost:3306/mydb"
    jdbc_user => "mysql"
    parameters => { "some_field" => "value" }
    schedule => "* * * * *"
    statement => "SELECT * from songs WHERE some_field = :some_field AND updated > :sql_last_value"
  }
}
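
Since the schedule will re-run the query repeatedly, you may also want repeated runs to update existing documents in Elasticsearch rather than create duplicates. One common approach (a sketch, assuming your table has an id primary key column; substitute your own key) is to set document_id in the elasticsearch output:

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "songs"
    # Using the primary key as the document id means a row picked up
    # again on a later run overwrites its document instead of adding
    # a duplicate. The "id" field name here is an assumption.
    document_id => "%{id}"
  }
}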