logstash

Failed to execute action :action=>LogStash::PipelineAction::Create/pipeline_id:main

喜你入骨 submitted on 2021-01-29 03:51:00
Question: I have installed ELK stack version 7.0.0 on my CentOS 7 VM and ran into an issue when starting the Logstash service: [ERROR] 2019-05-13 08:21:37.359 [Converge PipelineAction::Create] agent - Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"MultiJson::ParseError", :message=>"JrJackson::ParseError", :backtrace=>["/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/multi_json-1.13.1/lib/multi_json/adapter.rb:20:in load'", "/usr/share/logstash/vendor
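
A MultiJson::ParseError at pipeline creation generally means Logstash tried to parse a malformed JSON document while building the pipeline. A first step worth trying is validating the configuration without starting the service; the --config.test_and_exit flag is standard Logstash, while the paths below assume a typical package install:

    /usr/share/logstash/bin/logstash --config.test_and_exit \
        --path.settings /etc/logstash \
        -f /etc/logstash/conf.d/

If the syntax check passes, the next suspects are any JSON files the pipeline references (index templates, translate dictionaries), checked one at a time.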

Kafka Consumer Failed to load SSL keystore (Logstash ArcSight module) for any keystore type and path

柔情痞子 submitted on 2021-01-29 02:17:25
Question: I need to supply a certificate for client authentication for a Kafka consumer, but it always fails with the following exception (Failed to load SSL keystore): ssl.cipher.suites = null ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1] ssl.endpoint.identification.algorithm = https ssl.key.password = null ssl.keymanager.algorithm = SunX509 ssl.keystore.location = /usr/lib/jvm/java-8-openjdk-amd64/jre/lib/security/cacerts ssl.keystore.password = [hidden] ssl.keystore.type = JKS ssl.protocol
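
One detail that stands out in the dump is that ssl.keystore.location points at the JVM's cacerts file, which is a CA truststore rather than a keystore holding a client certificate and key; that mismatch alone can make keystore loading fail. For reference, with the plain Kafka input plugin (outside the ArcSight module) client authentication is configured roughly as below; paths and passwords are placeholders:

    input {
      kafka {
        bootstrap_servers => "kafka.example.com:9093"
        security_protocol => "SSL"
        ssl_keystore_location   => "/etc/logstash/client.keystore.jks"   # keystore containing the client cert and private key
        ssl_keystore_password   => "changeme"
        ssl_keystore_type       => "JKS"
        ssl_truststore_location => "/etc/logstash/client.truststore.jks" # CAs used to verify the brokers
        ssl_truststore_password => "changeme"
      }
    }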

How to send data from HTTP input to ElasticSearch using Logstash and jdbc_streaming filter?

耗尽温柔 submitted on 2021-01-29 00:56:23
Question: I want to send data from an HTTP input to Elasticsearch using Logstash, and I want to enrich the data using the jdbc_streaming filter plugin. This is my Logstash config: input { http { id => "sensor_data_http_input" user => "sensor_data" password => "sensor_data" } } filter { jdbc_streaming { jdbc_driver_library => "E:\ElasticStack\mysql-connector-java-8.0.18\mysql-connector-java-8.0.18.jar" jdbc_driver_class => "com.mysql.jdbc.Driver" jdbc_connection_string => "jdbc:mysql://localhost:3306/sensor_metadata"
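
The excerpt is cut off, but a complete jdbc_streaming setup follows this general shape; the statement, lookup column, and target field below are illustrative, not taken from the question. Note that with MySQL Connector/J 8.x the driver class is com.mysql.cj.jdbc.Driver (com.mysql.jdbc.Driver is the deprecated legacy name):

    filter {
      jdbc_streaming {
        jdbc_driver_library => "E:\ElasticStack\mysql-connector-java-8.0.18\mysql-connector-java-8.0.18.jar"
        jdbc_driver_class => "com.mysql.cj.jdbc.Driver"
        jdbc_connection_string => "jdbc:mysql://localhost:3306/sensor_metadata"
        jdbc_user => "user"
        jdbc_password => "password"
        statement => "SELECT * FROM sensors WHERE sensor_id = :id"   # hypothetical lookup query
        parameters => { "id" => "sensor_id" }    # maps an event field to the :id bind variable
        target => "sensor_details"               # enrichment rows land in this event field
      }
    }
    output {
      elasticsearch { hosts => ["localhost:9200"] index => "sensor_data" }
    }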

Logstash exception Expected one of #, input, filter, output at line 1, column 1

主宰稳场 submitted on 2021-01-28 19:02:15
Question: When I insert new data into my database (MySQL), Logstash doesn't pick it up dynamically. Below is logstash.conf (the file that connects Elasticsearch with MySQL): input { jdbc { jdbc_connection_string => "jdbc:mysql://localhost:3306/blog" # access to the database jdbc_user => "root" jdbc_password => "" jdbc_driver_library => "C:\Users\saidb\Downloads\mysql-connector-java-5.1.47\mysql-connector-java-5.1.47.jar" jdbc_driver_class => "com.mysql.jdbc.Driver" schedule => "* * * *
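
As for the exception in the title, "Expected one of #, input, filter, output at line 1, column 1" means the parser did not find one of those keywords at the very start of the file. A common culprit is a UTF-8 byte-order mark or other invisible characters added by a Windows editor; re-saving the file as plain UTF-8 without BOM, so that it begins literally with the input keyword, is a frequent fix. The file must open with one of these three sections:

    # logstash.conf must start with input, filter, output, or a # comment
    input  { ... }
    filter { ... }
    output { ... }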

How to import StructuredArgument for structured logging in scala using slf4j and logback

旧时模样 submitted on 2021-01-28 09:56:44
Question: This is probably a stupid question, but my Scala knowledge is a bit lacking. I'm trying to implement structured logging in Scala, and we're using slf4j/logback/logstash. I came across the following post: How does SLF4J support structured logging, which describes how to do it: import static net.logstash.logback.argument.StructuredArguments.*; /* * Add "name":"value" ONLY to the JSON output. * * Since there is no parameter for the argument, * the formatted message will NOT contain the key/value.
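
Since StructuredArguments exposes plain static methods, the Java static import translates directly into a Scala import of those members. A minimal sketch, assuming logstash-logback-encoder is on the classpath and logback is configured with its JSON encoder:

    import net.logstash.logback.argument.StructuredArguments.{keyValue, value}
    import org.slf4j.LoggerFactory

    object StructuredLoggingDemo extends App {
      private val logger = LoggerFactory.getLogger(getClass)

      // keyValue adds "orderId":"o-123" to the JSON output AND orderId=o-123 to the formatted message
      logger.info("processing order {}", keyValue("orderId", "o-123"))

      // value adds "userId":"u-42" to the JSON output, but the formatted message shows only the value
      logger.info("user {} logged in", value("userId", "u-42"))
    }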

python-logstash not working

有些话、适合烂在心里 submitted on 2021-01-27 12:46:02
Question: I have an Elasticsearch cluster (ELK) and some nodes sending logs to Logstash using Filebeat. Recently I added a new application server, which sends logs to my Logstash using python-logstash. My Logstash input configuration looks something like this: input { beats { type => beats port => 5044 } udp { port => 5044 } } My application server sends the logs successfully to Logstash. On my Logstash machine I tried to run the following command: tcpdump -nn | grep x.x.x.x x.x.x.x is the
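
One detail worth flagging in this config: the Beats input is TCP-based, while python-logstash's commonly used handler ships JSON events over UDP, conventionally to port 5959. Giving the UDP input its own port and a json codec keeps the two paths clearly separated; a sketch, with 5959 assumed rather than taken from the question:

    input {
      beats {
        port => 5044
      }
      udp {
        port  => 5959      # python-logstash's conventional UDP port
        codec => json      # python-logstash emits JSON-formatted events
      }
    }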

ELK Series (2): Using ELK in .NET Core

天涯浪子 submitted on 2021-01-26 08:45:19
Now that ELK is installed, let's use it from .NET Core. The general idea is to write log data into ELK via the NLog logging component; other languages work the same way. Installing ELK is still somewhat involved, so we can also run it in Docker: docker run -it --rm -p 9200:9200 -p 5601:5601 --name esk nshou/elasticsearch-kibana. Once this command completes, Elasticsearch and Kibana are running locally, and if nothing went wrong we can open the Kibana UI directly at localhost:5601. Note that several programs are running inside a single container here, which saves resources but also makes management more complex; this setup is not recommended for production. Likewise, we can reach Elasticsearch at localhost:9200, which returns its JSON status information. With Elasticsearch and Kibana in place we still need Logstash; I'll use a Logstash instance installed on Alibaba Cloud as an example. First, go into its directory and create a new configuration file named nlog.conf, with the content shown below. It uses the simplest possible configuration (honestly, I haven't fully figured out the complex ones yet): we listen on port 8001 and send the output to Elasticsearch at the IP and port given below. After adding the configuration file, run it from the logstash folder with: bin/logstash -f nlog.conf
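
The original post showed nlog.conf as an image that did not survive extraction. Based on the description (listen on port 8001, forward to Elasticsearch), a minimal reconstruction might look like the following; the Elasticsearch address and index name are placeholders:

    input {
      tcp {
        port  => 8001
        codec => json     # NLog can ship JSON log lines over TCP via its Network target
      }
    }
    output {
      elasticsearch {
        hosts => ["127.0.0.1:9200"]      # replace with the actual Elasticsearch IP and port
        index => "nlog-%{+YYYY.MM.dd}"
      }
    }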

Syncing MongoDB data to Elasticsearch in real time with the logstash-jdbc-input plugin

坚强是说给别人听的谎言 submitted on 2021-01-24 21:00:04
I. Introduction

logstash-jdbc-input is one of the official plugins provided by Logstash; it imports data from any database into Logstash through a JDBC interface. Most existing write-ups on exporting database data to ES with the logstash-jdbc-input plugin cover MySQL imports. This article covers how to use the logstash-jdbc-input plugin to import MongoDB data in real time.

II. Versions

This experiment uses ELK version 7.6.2. (A side note on importing MongoDB data: another commonly used tool is mongo-connector, but that plugin only supports Elasticsearch up to 5.x, so for newer Elasticsearch versions the method described here is the better choice.)

III. Implementation

1. Download the JDBC driver files and unpack them

Download: https://dbschema.com/jdbc-drivers/MongoDbJdbcDriver.zip

Unpack: unzip MongoDbJdbcDriver.zip (the archive contains three jar files: gson-2.8.6.jar, mongo-java-driver-3.12.4.jar, and mongojdbc2.1.jar)

Copy all the files (i.e. the three jars) to (~/logstash-7.6.2/logstash
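
The article cuts off mid-path, but the pipeline it is building toward typically looks like the sketch below. The driver class name follows the DbSchema MongoDB JDBC driver's convention; the database, collection, schedule, and index name are hypothetical:

    input {
      jdbc {
        # With the three jars copied into Logstash's own jars directory,
        # jdbc_driver_library can be left empty.
        jdbc_driver_library => ""
        jdbc_driver_class => "com.dbschema.MongoJdbcDriver"
        jdbc_connection_string => "jdbc:mongodb://localhost:27017/mydb"
        jdbc_user => ""
        schedule => "* * * * *"                  # poll once per minute
        statement => "db.mycollection.find()"    # this driver accepts MongoDB query syntax
      }
    }
    output {
      elasticsearch {
        hosts => ["localhost:9200"]
        index => "mongodb-sync"
      }
    }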