I have Logstash and Elasticsearch installed locally on my Windows 7 machine. I installed logstash-input-jdbc in Logstash.
I have data in a MySQL database which I send to Elasticsearch via Logstash, but the import only runs once. How can I make it run repeatedly so Elasticsearch stays in sync with the database?
By default, the logstash-input-jdbc plugin will run your SELECT statement once and then quit. You can change this behavior by adding a schedule parameter with a cron expression to your configuration, like this:
input {
  jdbc {
    jdbc_driver_library => "C:/logstash/lib/mysql-connector-java-5.1.37-bin.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://127.0.0.1:3306/test"
    jdbc_user => "root"
    jdbc_password => ""
    statement => "SELECT * FROM transport.audit"
    schedule => "* * * * *"        # <----- add this line
    jdbc_paging_enabled => "true"
    jdbc_page_size => "50000"
  }
}
The result is that the SELECT statement will now run every minute.
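The schedule option accepts a cron-like syntax (the plugin uses the rufus-scheduler library under the hood), so other intervals are just as easy to express. A few illustrative values (pick whichever matches your refresh needs):

```
schedule => "*/5 * * * *"   # every 5 minutes
schedule => "0 * * * *"     # at the top of every hour
schedule => "0 2 * * *"     # every day at 02:00
```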
If your MySQL table had a date field (which doesn't seem to be the case here), you could also use the pre-defined sql_last_start parameter so that you don't re-index all records on every run. That parameter can be used in your query like this:
statement => "SELECT * FROM transport.audit WHERE your_date_field >= :sql_last_start"
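Putting both pieces together, here is a sketch of a config that runs every minute and only selects rows newer than the previous run. Note that your_date_field is a placeholder for your actual column name, and that later versions of the plugin renamed sql_last_start to sql_last_value:

```
input {
  jdbc {
    jdbc_driver_library => "C:/logstash/lib/mysql-connector-java-5.1.37-bin.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://127.0.0.1:3306/test"
    jdbc_user => "root"
    jdbc_password => ""
    # run every minute; the WHERE clause skips rows already indexed on earlier runs
    schedule => "* * * * *"
    statement => "SELECT * FROM transport.audit WHERE your_date_field >= :sql_last_start"
  }
}
```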