logstash-jdbc

Aggregate multiple recursive objects in Logstash

谁都会走 submitted on 2020-07-10 10:28:15
Question: I am using Logstash with the JDBC input and would like to embed one object inside another using the aggregate filter. How can I add fields recursively, i.e. add an object inside another object? This is an example of the desired document:

    {
      "_index": "my-index",
      "_type": "test",
      "_id": "1",
      "_version": 1,
      "_score": 1,
      "_source": {
        "id": "1",
        "properties": {
          "nested_1": [
            {
              "A": 0,
              "B": "true",
              "C": "PEREZ, MATIAS ROGELIO Y/O",
              "Nested_2": [
                {
                  "Z1": "true",
                  "Z2": "99999"
                }
              ]
            },
            {
              "A": 0,
              "B": "true",
              "C": "SALVADOR MATIAS ROMERO",
              …
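One common approach is to emit one SQL row per innermost record and let the aggregate filter fold the rows into nested structures: the map it exposes is an ordinary Ruby hash, so it can hold arrays of hashes at any depth. A minimal sketch, assuming hypothetical column names (id, a, b, c, z1, z2) and rows ordered by id:

    filter {
      aggregate {
        task_id => "%{id}"
        code => "
          map['id'] ||= event.get('id')
          map['properties'] ||= { 'nested_1' => [] }
          # each row contributes one element of nested_1, itself holding a Nested_2 array
          map['properties']['nested_1'] << {
            'A' => event.get('a'),
            'B' => event.get('b'),
            'C' => event.get('c'),
            'Nested_2' => [ { 'Z1' => event.get('z1'), 'Z2' => event.get('z2') } ]
          }
          event.cancel()
        "
        push_previous_map_as_event => true
        timeout => 3
      }
    }

Note that the aggregate filter requires pipeline.workers set to 1 so that rows belonging to the same id are processed in order on a single thread.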

Logstash exception while fetching incremental data from SQL Server

点点圈 submitted on 2020-02-06 18:06:29
Question: I am using Logstash 7.3.2 to fetch incremental data from SQL Server using this query:

    select * from mytable where lastupdatetimestamp > :sql_last_value

I have also specified last_run_metadata_path in the Logstash config file. It works fine, but sometimes it throws an exception:

    Exception when executing JDBC query {:exception=>#<… transition (daylight savings time 'gap'): 1942-09-01T00:00:00.000 (Asia/Kolkata)>}

Why am I getting this exception, and because of this exception it does not save the last …
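For reference, a typical incremental-fetch input looks roughly like the sketch below (connection details and paths are assumptions). The jdbc_default_timezone option controls how the plugin interprets the database's local timestamps and is often suggested for this class of timezone-transition error; whether it resolves the 1942 Asia/Kolkata gap depends on the data:

    input {
      jdbc {
        jdbc_connection_string => "jdbc:sqlserver://HOST:PORT;databaseName=DB_NAME"
        jdbc_user => "user"
        jdbc_password => "password"
        jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
        statement => "select * from mytable where lastupdatetimestamp > :sql_last_value"
        use_column_value => true
        tracking_column => "lastupdatetimestamp"
        tracking_column_type => "timestamp"
        last_run_metadata_path => "/var/lib/logstash/.mytable_last_run"
        schedule => "*/5 * * * *"
        # interpret DB timestamps as UTC instead of the JVM's local zone
        jdbc_default_timezone => "UTC"
      }
    }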

Logstash, MongoDB and JDBC

萝らか妹 submitted on 2020-01-25 23:53:27
Question: I have a problem configuring Logstash. I want to be able to use the JDBC input with MongoDB. My config:

    input {
      jdbc {
        jdbc_driver_library => "mongo-java-driver-3.2.2.jar"
        jdbc_driver_class => "com.mongodb.MongoClient"
        jdbc_connection_string => "jdbc:mongodb://localhost:27017"
        jdbc_user => ""
      }
    }
    output {
      stdout { }
    }

The problem is:

    :error=>"Java::JavaSql::SQLException: No suitable driver found for jdbc:mongodb://localhost:27017/"

Answer 1: More inputs would be good. You must specify the location of …
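The underlying issue is that com.mongodb.MongoClient is the MongoDB Java driver's client class, not a JDBC driver: jdbc_driver_class must name a class implementing java.sql.Driver. A sketch of the usual fix, assuming a third-party MongoDB JDBC driver such as DbSchema's mongo-jdbc-driver (the jar path, database name, and query below are illustrative assumptions):

    input {
      jdbc {
        # an actual JDBC driver for MongoDB, e.g. DbSchema's mongo-jdbc-driver
        jdbc_driver_library => "/path/to/mongojdbc.jar"
        jdbc_driver_class => "com.dbschema.MongoJdbcDriver"
        jdbc_connection_string => "jdbc:mongodb://localhost:27017/mydb"
        jdbc_user => ""
        # query syntax depends on the driver; this one accepts Mongo-shell-style queries
        statement => "db.mycollection.find({})"
      }
    }
    output {
      stdout { }
    }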

Why is Logstash throwing a daylight saving time 'gap' error with SQL Server data

这一生的挚爱 submitted on 2019-12-20 06:09:23
Question: We are using Logstash 7.3.2 to fetch SQL Server data. It works fine, but sometimes it throws the exception below:

    Exception when executing JDBC query {:exception=>#<… transition (daylight savings time 'gap'): 1942-09-01T00:00:00.000 (Asia/Kolkata)>}

When I check in SQL Server there is no value like 1942-09-01T00:00:00.000. My Logstash config is as below:

    jdbc_connection_string => "jdbc:sqlserver://HOST:PORT;databaseName=DB_NAME;integratedSecurity=false
    jdbc_user => …
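A workaround often suggested for this class of error is to stop the JDBC input from coercing the datetime column through the JVM's timezone rules at all, by converting it to text on the SQL Server side (table and column names below are assumptions):

    select id,
           name,
           -- style 126 yields ISO 8601 text, which survives the fetch
           -- without any Joda-Time timezone conversion
           CONVERT(varchar(30), lastupdatetimestamp, 126) as lastupdatetimestamp
    from mytable
    where lastupdatetimestamp > :sql_last_value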

Logstash JDBC left outer join as subdocuments

余生长醉 submitted on 2019-12-12 02:59:32
Question: I'm using the Logstash JDBC plugin to get MySQL data into Elasticsearch. Due to a left outer join I end up with multiple 'child' rows for a single 'parent' row. Say one user has one or more documents. I tried to GROUP_CONCAT the text of the documents and then GROUP BY the user id to retain one row per user. However, MySQL's GROUP_CONCAT has a default length limit of 1024 bytes... Does anyone know a solution that avoids GROUP_CONCAT altogether and handles left outer joins as nested documents? Thanks

Answer 1: …
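Two routes are commonly suggested. The quick one keeps GROUP_CONCAT but raises its limit per connection; with MySQL Connector/J this can be done directly in the JDBC URL via sessionVariables, so no separate SET statement is needed (host, schema, and column names below are assumptions):

    input {
      jdbc {
        jdbc_connection_string => "jdbc:mysql://localhost:3306/data?sessionVariables=group_concat_max_len=1000000"
        jdbc_user => "username"
        jdbc_password => "password"
        jdbc_driver_class => "com.mysql.jdbc.Driver"
        statement => "SELECT u.id, u.name, GROUP_CONCAT(d.text SEPARATOR '\n') AS doc_text FROM users u LEFT OUTER JOIN documents d ON d.user_id = u.id GROUP BY u.id, u.name"
      }
    }

The structural alternative is to select the joined rows unaggregated, ORDER BY the user id, and fold the child rows into a nested array with the aggregate filter, as in the sketch near the top of this page.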

Delete old documents from Elasticsearch using Logstash

这一生的挚爱 submitted on 2019-12-11 09:38:00
Question: I am using Logstash to index data from Postgres (JDBC input plugin) into Elasticsearch. I don't have any time-based information in the database. The Postgres table users to import has two columns: userid (unique) and uname. In Elasticsearch, _id = userid. I am exporting this data every hour using a cron schedule in Logstash:

    input {
      jdbc {
        schedule => "0 */1 * * *"
        statement => "SELECT userid, uname FROM users"
      }
    }
    output {
      elasticsearch {
        hosts => ["elastic_search_host"]
        index => "user_data"
        …
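Since _id = userid, each hourly run already overwrites existing users in place; the remaining problem is users deleted from Postgres. One common pattern (a sketch; the field name last_seen is an assumption) stamps every document with the run time so stale documents can be purged afterwards:

    filter {
      # record when this row was last seen by the importer
      mutate { add_field => { "last_seen" => "%{@timestamp}" } }
    }
    output {
      elasticsearch {
        hosts => ["elastic_search_host"]
        index => "user_data"
        document_id => "%{userid}"
      }
    }

A scheduled job (cron plus curl, for example) can then issue POST user_data/_delete_by_query with a range query such as last_seen < now-2h, deleting users that no longer exist in Postgres.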

Logstash error when converting MySQL value to nested Elasticsearch property on suggestion field

时光怂恿深爱的人放手 submitted on 2019-12-04 12:46:41
A huge cry for help here. When I try to convert a MySQL value to a nested Elasticsearch field using Logstash, I get the following error:

    {"exception"=>"expecting List or Map, found class org.logstash.bivalues.StringBiValue", "backtrace"=>["org.logstash.Accessors.newCollectionException(Accessors.java:195)" …

Using the following config file:

    input {
      jdbc {
        jdbc_driver_library => "/logstash/mysql-connector-java-5.1.42-bin.jar"
        jdbc_driver_class => "com.mysql.jdbc.Driver"
        jdbc_connection_string => "jdbc:mysql://localhost:3306/data"
        jdbc_user => "username"
        jdbc_password => "password"
        statement => …
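This error generally means a field that already holds a plain string is being addressed as if it were an object, e.g. writing to [suggest][input] when the statement already selected a flat column named suggest. A sketch of one fix, assuming that column name: move the string out of the way before building the nested structure:

    filter {
      # "suggest" arrives from MySQL as a flat string, so [suggest][input]
      # cannot be created on top of it; stash the value first
      mutate { rename => { "suggest" => "suggest_text" } }
      mutate {
        add_field => { "[suggest][input]" => "%{suggest_text}" }
        remove_field => [ "suggest_text" ]
      }
    }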