kibana-4

Vertical bar chart in Kibana

ぃ、小莉子 submitted on 2019-12-12 05:38:57
Question: I have set up the ELK stack, and the following JSON is stored in Elasticsearch (copied from the Kibana UI). I now want to display a vertical bar chart of the top 5 "hostname" values where "action" equals "passthrough": { "_index": "logstash-2016.06.16", "_type": "utm", "_id": "AVVaFcaB7mNsx5uOb1-_", "_score": null, "_source": { "message": "<190>date=2016-06-16 time=22:10:26 hostname=\"googleads.g.doubleclick.net\" profile=\"Software_Dept\" action=passthrough", "@version
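A terms aggregation filtered on action:passthrough can produce this kind of top-5 breakdown. A minimal sketch of the request body, assuming the field names from the document above and that the fields are indexed as exact values (Kibana builds an equivalent request when the X-axis is a terms aggregation on hostname with size 5 and the search bar contains action:passthrough):

```json
{
  "size": 0,
  "query": {
    "bool": {
      "filter": { "term": { "action": "passthrough" } }
    }
  },
  "aggs": {
    "top_hostnames": {
      "terms": { "field": "hostname", "size": 5 }
    }
  }
}
```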

Unable to run logstash config file (permission denied)

|▌冷眼眸甩不掉的悲伤 submitted on 2019-12-12 03:38:40
Question: My config file is stored in /etc/logstash/ and I ran the command $ /etc/logstash -f /etc/logstash/logstash.conf as root. However, I got "permission denied" when I tried that. Is there any way to solve this? Answer 1: As said, you need to run /opt/logstash/bin/logstash -f /etc/logstash/logstash.conf instead of /etc/logstash -f /etc/logstash/logstash.conf . This is caused by the default directory structure of your Linux system, which Logstash uses to put its files in. Wikipedia:
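The layout behind the fix, sketched under the assumption of a standard package install following the Filesystem Hierarchy Standard (the directory /etc/logstash holds configuration, not an executable, which is why invoking it fails):

```
/opt/logstash/bin/logstash    <- the executable (binaries live under /opt or /usr/share)
/etc/logstash/logstash.conf   <- the configuration file (configs live under /etc)

$ /opt/logstash/bin/logstash -f /etc/logstash/logstash.conf
```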

Kibana, filter data on the basis of one field and then grouping values on basis of timestamp (yearly)

一个人想着一个人 submitted on 2019-12-12 03:30:01
Question: I have three fields in my data: tran_date, use_case, Amount. The use_case field has multiple values, i.e. B2B, cash_in, C2C, etc. I want to plot a bar chart summing the Amount field against use_case (B2B and cash_in only), then group the data on a yearly basis, so there is one bar per year summing Amount for the use cases B2B and cash_in. I explored filters and sub-buckets, but they don't seem to provide grouping of values. Answer 1: "Sub-buckets" create aggregations, which are
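The described chart maps to a filtered yearly date_histogram with a sum sub-aggregation. A sketch of the request body, assuming the field names from the question and that use_case is indexed as an exact value (in Kibana this corresponds to Y-axis = Sum of Amount, X-axis = date histogram on tran_date with yearly interval, and the search bar query use_case:(B2B OR cash_in)):

```json
{
  "size": 0,
  "query": {
    "bool": {
      "filter": { "terms": { "use_case": ["B2B", "cash_in"] } }
    }
  },
  "aggs": {
    "per_year": {
      "date_histogram": { "field": "tran_date", "interval": "year" },
      "aggs": {
        "total_amount": { "sum": { "field": "Amount" } }
      }
    }
  }
}
```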

Not getting each error email alert from logstash 1.5.4

孤街醉人 submitted on 2019-12-12 02:49:56
Question: My ELK setup is as follows:
HOST1: Component (which generates logs) + Logstash (to send logs to Redis)
HOST2: Redis + Elasticsearch + Logstash (to parse data based on grok and send it to Elasticsearch on the same host)
HOST3: Redis + Elasticsearch + Logstash (to parse data based on grok and send it to Elasticsearch on the same host)
HOST4: nginx + Kibana 4
Now when I send one error log line from Logstash to Redis, I get a double entry in Kibana 4, like below: Plus I didn't get any email alert from
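For the alerting half of the question, Logstash 1.5 ships an email output plugin that can be fired conditionally from the indexer pipeline. A minimal sketch under stated assumptions: the Redis key, the addresses, and the "ERROR" substring match are all hypothetical, and the grok filter from the question is omitted:

```conf
input {
  redis { host => "127.0.0.1" data_type => "list" key => "logstash" }
}
output {
  elasticsearch { host => "127.0.0.1" }
  if "ERROR" in [message] {
    email {
      to      => "ops@example.com"
      from    => "logstash@example.com"
      subject => "Error logged"
      body    => "%{message}"
    }
  }
}
```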

Objects in array is not well supported error observed for ELK docker image

隐身守侯 submitted on 2019-12-11 18:48:20
Question: I'm using the latest ELK image for the Kibana dashboard, and I have a JSON file containing arrays ([]). I'm not able to show those as fields in Kibana, and it shows an "objects in arrays are not well supported" error message. Following the Kibana documentation I went through the link below, but I didn't find anything useful for the ELK docker image. https://github.com/istresearch/kibana-object-format I tried to run the command bin/kibana-plugin install <package.zip> but it
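For a Docker-based ELK, the plugin has to be installed into the image at build time rather than on the host. A hedged sketch, assuming the common sebp/elk layout (Kibana under /opt/kibana) and a locally downloaded plugin zip; the file name and path are hypothetical:

```dockerfile
FROM sebp/elk
COPY kibana-object-format.zip /tmp/
RUN /opt/kibana/bin/kibana-plugin install file:///tmp/kibana-object-format.zip
```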

ElasticSearch + Kibana to display business data

这一生的挚爱 submitted on 2019-12-11 16:50:03
Question: I have visitor data captured over the past several years: over 14 million records. On top of that I have form data from the past several years, and there is a common ID between the two. Right now I'm attempting to learn Elasticsearch + Kibana using the visitor data. The data is fairly simple but not very well formatted: PHP's $_REQUEST and $_SERVER data. Here's an example from a Google bot visit: {u'Entrance Time': 1407551587.7385, u'domain': u'############', u'pages': {u

Elasticsearch 2.1: Cannot install Marvel into Kibana

泪湿孤枕 submitted on 2019-12-11 13:30:25
Question: I am a newbie to the ES world and am just trying to get my local environment set up. I am on a Mac and used Homebrew to install Elasticsearch and Kibana. Now I want to add Marvel to Kibana, but it fails with the following error: user :/usr/local/opt/kibana/(master)$ bin/kibana plugin --install elasticsearch/marvel/latest Installing marvel Attempting to extract from https://download.elastic.co/elasticsearch/marvel/marvel-latest.tar.gz Downloading 3843924 bytes.................... Extraction

Scripting dynamic Elasticsearch queries inside Kibana visualization?

允我心安 submitted on 2019-12-11 11:29:39
Question: Hi, I'm new to the ELK stack. I'm using Kibana 4.1. I've managed to use the Elasticsearch Query DSL to run searches within Kibana's Discover interface to capture a data set, then used that saved search to create a new visualization and dashboard widget in Kibana. My Elasticsearch query looks like: { "bool" : { "must" : [ { "match" : { "service" : "servicename" } }, { "match_phrase" : { "msg" : "Trying to get security token for user: joe" } } ], "minimum_should_match" : 1, "boost" : 1.0 } }
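Note that the fragment above is only the bool clause; as a standalone request body it needs to be wrapped in "query". Also, "minimum_should_match" has no effect here, since it only applies to "should" clauses and this query has none. A sketch of the full body, using the values from the question:

```json
{
  "query": {
    "bool": {
      "must": [
        { "match": { "service": "servicename" } },
        { "match_phrase": { "msg": "Trying to get security token for user: joe" } }
      ]
    }
  }
}
```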

ElasticSearch + Kibana - Unique count using pre-computed hashes

我的未来我决定 submitted on 2019-12-11 09:40:09
Question: (update: added) I want to perform a unique count on my Elasticsearch cluster. The cluster contains about 50 million records. I've tried the following methods. First method, mentioned in this section: "Pre-computing hashes is usually only useful on very large and/or high-cardinality fields as it saves CPU and memory." Second method, mentioned in this section: "Unless you configure Elasticsearch to use doc_values as the field data format, the use of aggregations and facets is very demanding on heap
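The pre-computed-hashes approach works by indexing a murmur3 sub-field at write time and running the cardinality aggregation over the hashes instead of the original strings. A sketch under stated assumptions: the field name user_id is hypothetical, and depending on the Elasticsearch version the murmur3 type may require the mapper-murmur3 plugin. First, the mapping:

```json
{
  "mappings": {
    "record": {
      "properties": {
        "user_id": {
          "type": "string",
          "fields": { "hash": { "type": "murmur3" } }
        }
      }
    }
  }
}
```

Then the unique count targets the hashed sub-field:

```json
{
  "size": 0,
  "aggs": {
    "unique_users": {
      "cardinality": { "field": "user_id.hash" }
    }
  }
}
```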

How to load Kibana3 dashboards into Kibana4?

青春壹個敷衍的年華 submitted on 2019-12-11 07:09:34
Question: I have recently installed Kibana 4, but I am beginning to understand that dashboards are designed differently than in Kibana 3, i.e., multiple individually designed visualizations are embedded into each dashboard. I already have a lot of dashboards designed in Kibana 3, so I would like to know if there is a way to load them into Kibana 4 instead of creating everything from scratch. Answer 1: To the best of my knowledge, there is no way to do that. Not just the formats, but the queries sent to the ES backend are