How to send data from an HTTP input to Elasticsearch using Logstash and the jdbc_streaming filter?

Submitted by 雨燕双飞 on 2021-01-29 00:55:32

Question


I want to send data from an HTTP input to Elasticsearch using Logstash, and I want to enrich the data with the jdbc_streaming filter plugin. This is my Logstash config:

input {
  http {
    id => "sensor_data_http_input"
    user => "sensor_data"
    password => "sensor_data"
  }
}

filter {
  jdbc_streaming {
    jdbc_driver_library => "E:\ElasticStack\mysql-connector-java-8.0.18\mysql-connector-java-8.0.18.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://localhost:3306/sensor_metadata"
    jdbc_user => "elastic"
    jdbc_password => "hide"
    statement => "select st.sensor_type as sensorType, l.customer as customer, l.department as department, l.building_name as buildingName, l.room as room, l.floor as floor, l.location_on_floor as locationOnFloor, l.latitude, l.longitude from sensors s inner join sensor_type st on s.sensor_type_id=st.sensor_type_id inner join location l on s.location_id=l.location_id where s.sensor_id= :sensor_identifier"
    parameters => { "sensor_identifier" => "sensor_id"}
    target => "lookupResult"
  }
  mutate {
    rename => {"[lookupResult][0][sensorType]" => "sensorType"}
    rename => {"[lookupResult][0][customer]" => "customer"}
    rename => {"[lookupResult][0][department]" => "department"}
    rename => {"[lookupResult][0][buildingName]" => "buildingName"}
    rename => {"[lookupResult][0][room]" => "room"}
    rename => {"[lookupResult][0][floor]" => "floor"}
    rename => {"[lookupResult][0][locationOnFloor]" => "locationOnFloor"}

    # Build a "lat,lon" string from the first lookup row
    add_field => {
            "location" => "%{[lookupResult][0][latitude]},%{[lookupResult][0][longitude]}"
        }

    remove_field => ["lookupResult", "headers", "host"]
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "sensor_data-%{+YYYY.MM.dd}"
    user => "elastic"
    password => "hide"
  }
}

But when I start Logstash, I see the following error:

[2020-01-09T22:57:16,260]
[ERROR][logstash.javapipeline]
[main] Pipeline aborted due to error {
    :pipeline_id=>"main", 
    :exception=>#<TypeError: failed to coerce jdk.internal.loader.ClassLoaders$AppClassLoader to java.net.URLClassLoader>, 
    :backtrace=>[
        "org/jruby/java/addons/KernelJavaAddons.java:29:in `to_java'", 
        "E:/ElasticStack/Logstash/logstash-7.4.1/vendor/bundle/jruby/2.5.0/gems/logstash-filter-jdbc_streaming-1.0.7/lib/logstash/plugin_mixins/jdbc_streaming.rb:48:in `prepare_jdbc_connection'", 
        "E:/ElasticStack/Logstash/logstash-7.4.1/vendor/bundle/jruby/2.5.0/gems/logstash-filter-jdbc_streaming-1.0.7/lib/logstash/filters/jdbc_streaming.rb:200:in `prepare_connected_jdbc_cache'", 
        "E:/ElasticStack/Logstash/logstash-7.4.1/vendor/bundle/jruby/2.5.0/gems/logstash-filter-jdbc_streaming-1.0.7/lib/logstash/filters/jdbc_streaming.rb:116:in `register'", "org/logstash/config/ir/compiler/AbstractFilterDelegatorExt.java:56:in `register'", 
        "E:/ElasticStack/Logstash/logstash-7.4.1/logstash-core/lib/logstash/java_pipeline.rb:195:in `block in register_plugins'", "org/jruby/RubyArray.java:1800:in `each'", 
        "E:/ElasticStack/Logstash/logstash-7.4.1/logstash-core/lib/logstash/java_pipeline.rb:194:in `register_plugins'", 
        "E:/ElasticStack/Logstash/logstash-7.4.1/logstash-core/lib/logstash/java_pipeline.rb:468:in `maybe_setup_out_plugins'", 
        "E:/ElasticStack/Logstash/logstash-7.4.1/logstash-core/lib/logstash/java_pipeline.rb:207:in `start_workers'", 
        "E:/ElasticStack/Logstash/logstash-7.4.1/logstash-core/lib/logstash/java_pipeline.rb:149:in `run'", 
        "E:/ElasticStack/Logstash/logstash-7.4.1/logstash-core/lib/logstash/java_pipeline.rb:108:in `block in start'"], 
    :thread=>"#<Thread:0x17fa8113 run>"
}
[2020-01-09T22:57:16,598]
[ERROR][logstash.agent] Failed to execute action {
    :id=>:main, 
    :action_type=>LogStash::ConvergeResult::FailedAction, 
    :message=>"Could not execute action: PipelineAction::Create<main>, action_result: false", 
    :backtrace=>nil
}

I am enriching my HTTP input with data from my MySQL database, but Logstash does not start at all.


Answer 1:


I see two potential problems; you need to check which one is really the issue here:

  1. The MySQL driver class name has changed to com.mysql.cj.jdbc.Driver.
  2. A classloader problem can occur when a recent JDBC driver that sits outside the classloader path is used in combination with newer JDK versions; there are several issues about this on GitHub. Put the driver into the Logstash folder under <logstash-install-dir>/vendor/jar/jdbc/ (you need to create this folder first). If that doesn't work, move the driver to <logstash-install-dir>/logstash-core/lib/jars and don't provide any driver path in the config file: jdbc_driver_library => "" (see the sketch after this list).
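
As a reference, here is a minimal sketch of the jdbc_streaming block showing only the settings these two suggestions touch; everything not shown (connection string, credentials, statement, parameters, target) stays exactly as in the question:

filter {
  jdbc_streaming {
    # Suggestion 2: the Connector/J jar has been copied into one of the
    # Logstash jar folders named above, so no external driver path is set
    jdbc_driver_library => ""
    # Suggestion 1: driver class name used by MySQL Connector/J 8.x
    jdbc_driver_class => "com.mysql.cj.jdbc.Driver"
  }
}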



Answer 2:


The problem was solved by removing the jdbc_driver_library option entirely from the config file and also, as mentioned, by setting jdbc_driver_class to com.mysql.cj.jdbc.Driver.
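
Putting it together, a complete jdbc_streaming block for this working setup (the query and connection details are the ones from the question; the Connector/J 8.x jar is assumed to sit under <logstash-install-dir>/logstash-core/lib/jars as described in answer 1) might look like this:

filter {
  jdbc_streaming {
    # No jdbc_driver_library at all: the driver jar is picked up from
    # Logstash's own classpath (logstash-core/lib/jars)
    jdbc_driver_class => "com.mysql.cj.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://localhost:3306/sensor_metadata"
    jdbc_user => "elastic"
    jdbc_password => "hide"
    statement => "select st.sensor_type as sensorType, l.customer as customer, l.department as department, l.building_name as buildingName, l.room as room, l.floor as floor, l.location_on_floor as locationOnFloor, l.latitude, l.longitude from sensors s inner join sensor_type st on s.sensor_type_id=st.sensor_type_id inner join location l on s.location_id=l.location_id where s.sensor_id= :sensor_identifier"
    parameters => { "sensor_identifier" => "sensor_id" }
    target => "lookupResult"
  }
}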



Source: https://stackoverflow.com/questions/59698179/how-to-send-data-from-http-input-to-elasticsearch-using-logstash-ans-jdbc-stream
