Elasticsearch

Add additional attributes to an existing document elasticsearch

廉价感情. Submitted on 2021-02-08 12:33:30

Question: How do I add additional attributes to an existing document in an Elasticsearch index?

$ curl -XPUT 'http://localhost:9200/twitter/tweet/1' -d '{
    "user" : "kimchy",
    "post_date" : "2009-11-15T14:12:12",
    "message" : "trying out Elastic Search"
}'

This creates a document in the index. How do I then add an attribute, say "new_attribute":"new_value", so that the document becomes "user" : "kimchy", "post_date" : "2009-11-15T14:12:12", "message" : "trying out Elastic Search" ...
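A sketch of one common answer: Elasticsearch's `_update` endpoint accepts a partial document under `doc` and merges it into the stored source, so the existing fields are kept and the new attribute is added. The URL below reuses the question's 1.x-era index/type/id layout; on 7.x+ clusters (where mapping types are gone) the path would be `/twitter/_update/1` instead.

```shell
# Partial update: merge "new_attribute" into document 1 without
# resending the whole source.
curl -XPOST 'http://localhost:9200/twitter/tweet/1/_update' -d '{
  "doc" : { "new_attribute" : "new_value" }
}'
```

Alternatively, a second PUT of the full body (all old fields plus the new one) replaces the document and reaches the same end state, at the cost of resending everything.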

Return sets of keywords derived from fields in ElasticSearch

跟風遠走 Submitted on 2021-02-08 11:57:30

Question: I'm fairly new to this and couldn't find the answer I was looking for online. I'm trying to build autocomplete based on keywords derived from some text fields. Given documents like:

"name": "One liter of Chocolate Milk"
"name": "Milo Milk 250g"
"name": "HiLow low fat milk"
"name": "Yoghurt strawberry"
"name": "Milk Nutrisoy"

when I type "mi", I expect results like "milk", "milo", "milo milk", "chocolate milk", etc. A very good example is ...
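One common approach (a sketch, not taken from the excerpt; the index name and gram sizes are assumptions): index the name field through an edge_ngram token filter so every token is searchable by its prefixes, while searching with the standard analyzer so the query "mi" is not itself expanded.

```shell
curl -XPUT 'http://localhost:9200/products' -H 'Content-Type: application/json' -d @- <<'EOF'
{
  "settings": {
    "analysis": {
      "filter": {
        "autocomplete_filter": { "type": "edge_ngram", "min_gram": 2, "max_gram": 15 }
      },
      "analyzer": {
        "autocomplete": {
          "type": "custom",
          "tokenizer": "standard",
          "filter": [ "lowercase", "autocomplete_filter" ]
        }
      }
    }
  },
  "mappings": {
    "properties": {
      "name": { "type": "text", "analyzer": "autocomplete", "search_analyzer": "standard" }
    }
  }
}
EOF
```

With this mapping, a plain match query for "mi" on name matches every document whose name contains a token beginning with "mi" (Milk, Milo, ...). Returning the matching *keywords* rather than whole documents additionally needs something like a terms aggregation or the completion suggester.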

Logstash pipeline not working with csvfile

家住魔仙堡 Submitted on 2021-02-08 11:33:50

Question: I set it up like below:

wget https://artifacts.elastic.co/downloads/logstash/logstash-6.6.2.deb
sudo dpkg -i logstash-6.6.2.deb
sudo systemctl enable logstash.service
sudo systemctl start logstash.service

and I added a pipeline script like below:

input {
  file {
    path => "/root/dev/Intuseer-PaaS/backend/airound_sv_logs.log"
    start_position => "beginning"
  }
}
output {
  stdout {}
  file {
    path => "/root/dev/output/output-%{+YYYY-MM-dd}.log"
  }
}

The log file looks like: timestamp, server_cpu, server ...
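Since the input is a CSV file, the usual missing piece is a csv filter between input and output; a sketch, with column names taken from the truncated header in the excerpt (the full column list is unknown, so these are illustrative):

```
filter {
  csv {
    separator => ","
    # Replace with the real header columns; the excerpt only shows the first two.
    columns => ["timestamp", "server_cpu"]
    skip_header => "true"
  }
}
```

Note also that the file input tracks its read position in a sincedb file, so a file that was already read once will not be re-read on restart; while testing, setting `sincedb_path => "/dev/null"` in the file input forces a fresh read from `start_position => "beginning"`.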

Average of difference between the dates

我们两清 Submitted on 2021-02-08 11:29:43

Question: A snippet of my Elasticsearch data is below; the status field is nested.

status: [
  { "updated_at": "2020-08-04 17:18:41", "created_at": "2020-08-04 17:18:39", "sub_stage": "Stage1" },
  { "updated_at": "2020-08-04 17:21:15", "created_at": "2020-08-04 17:18:41", "sub_stage": "Stage2" },
  { "updated_at": "2020-08-04 17:21:15", "created_at": "2020-08-04 17:21:07", "sub_stage": "Stage3" }
]

After aggregating based on some field, I have some documents for each bucket, and every document will have ...
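A sketch of one way to average the per-entry duration (updated_at minus created_at), assuming both fields are mapped as dates inside a nested status field; the index name is hypothetical, and on real data the custom "yyyy-MM-dd HH:mm:ss" format must be declared in the date mapping:

```shell
curl -XPOST 'http://localhost:9200/myindex/_search' -H 'Content-Type: application/json' -d @- <<'EOF'
{
  "size": 0,
  "aggs": {
    "status_entries": {
      "nested": { "path": "status" },
      "aggs": {
        "avg_duration_ms": {
          "avg": {
            "script": "doc['status.updated_at'].value.millis - doc['status.created_at'].value.millis"
          }
        }
      }
    }
  }
}
EOF
```

The nested aggregation switches context to the individual status entries, so the scripted avg runs once per entry; to get a per-bucket average instead of a global one, this sub-aggregation would sit under the outer terms aggregation from the question.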

Not able to insert JSON from PostgreSQL to elasticsearch. Getting error - “Exception when executing JDBC query”

半城伤御伤魂 Submitted on 2021-02-08 10:44:22

Question: I am trying to migrate data from a PostgreSQL server to Elasticsearch. The Postgres data is in JSONB format. When I start the river, I get the error below:

[INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2019-01-07T14:22:34,625][INFO ][logstash.inputs.jdbc ] (0.128981s) SELECT to_json(details) from inventory.retailer_products1 limit 1
[2019-01-07T14:22:35,099][WARN ][logstash.inputs.jdbc ] Exception when executing JDBC query {:exception=>#<Sequel: ...
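A common workaround for this class of error (a sketch; the table and column names are taken from the query in the log, the rest is an assumption): the JDBC layer often cannot map PostgreSQL's json/jsonb result type, so cast it to text in the SQL statement and parse it back into a structured field on the Logstash side:

```
input {
  jdbc {
    # connection settings omitted
    statement => "SELECT to_json(details)::text AS details FROM inventory.retailer_products1"
  }
}
filter {
  # Parse the text column back into structured JSON on the event.
  json {
    source => "details"
  }
}
```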

Find closest GeoJSON polygon to point when point lies outside of all polygons in Elasticsearch

邮差的信 Submitted on 2021-02-08 09:28:14

Question: Using Elasticsearch, say I have documents of GeoJSON polygons. I'm using the percolator (as in "How to know if a geo coordinate lies within a geo polygon in elasticsearch?") to find whether a point lies within a polygon. That works great! Is there any way in Elasticsearch to determine: (1) if a point lies outside of all polygons, and (2) what the closest polygon to that point is, returning that polygon?

Source: https://stackoverflow.com/questions/41558123/find-closest-geojson-polygon-to-point-when-point-lies-outside-of
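A partial approach for part (1), sketched with hypothetical index and field names: a geo_shape query with an intersects relation returns every polygon containing the point, so an empty hit list means the point lies outside all of them.

```shell
curl -XPOST 'http://localhost:9200/regions/_search' -H 'Content-Type: application/json' -d @- <<'EOF'
{
  "query": {
    "geo_shape": {
      "area": {
        "shape": { "type": "point", "coordinates": [ -77.03, 38.89 ] },
        "relation": "intersects"
      }
    }
  }
}
EOF
```

For part (2), Elasticsearch has no built-in sort by distance to a polygon's edge; a common approximation is to also store each polygon's centroid in a geo_point field and use a geo_distance sort against that, accepting that for large or irregular polygons the nearest centroid is not always the nearest boundary.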

Elastic Search Dynamic fields in hierarchy of JSON

萝らか妹 Submitted on 2021-02-08 08:24:42

Question: I'm going to have a JSON document with nesting. An example:

{
  "userid_int" : <integer>,
  "shoes" : {
    "size_int" : <integer>,
    "brand_str" : <string>,
    "addeddate_dt" : <date>
  },
  "shirt" : {
    "size_int" : <integer>,
    "brand_str" : <string>,
    "addeddate_dt" : <date>,
    "color_str" : <string>
  },
  ...
}

There is no limit on what the nested fields could be. For example, I may want a new key "pyjamas" for a particular document, but this is unknown upfront while the index is being created. All I ...
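The `_int`/`_str`/`_dt` suffixes in the field names suggest one possible answer (a sketch, not from the excerpt; the index name is hypothetical): dynamic templates map fields that do not exist yet by name pattern, so a future "pyjamas" object gets sensible types automatically the first time it is indexed.

```shell
curl -XPUT 'http://localhost:9200/wardrobe' -H 'Content-Type: application/json' -d @- <<'EOF'
{
  "mappings": {
    "dynamic_templates": [
      { "ints":  { "match": "*_int", "mapping": { "type": "integer" } } },
      { "strs":  { "match": "*_str", "mapping": { "type": "keyword" } } },
      { "dates": { "match": "*_dt",  "mapping": { "type": "date" } } }
    ]
  }
}
EOF
```

Any new field at any nesting depth whose name ends in `_int`, `_str`, or `_dt` is then mapped by the matching template rather than by Elasticsearch's default type guessing.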

elasticsearch changing path.logs and/or path.data - fails to start

百般思念 Submitted on 2021-02-08 06:47:48

Question: Here's my config:

# ----------------------------------- Paths ------------------------------------
#
# Path to directory where to store the data (separate multiple locations by comma):
#
path.data: /mulelogs/elasticsearch
path.logs: /mulelogs/elasticsearch

When I restart Elasticsearch this is what I get:

elasticsearch.service - Elasticsearch
   Loaded: loaded (/usr/lib/systemd/system/elasticsearch.service; enabled; vendor preset: disabled)
   Active: failed (Result: exit-code) since Mon 2016-01-25 ...
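The usual cause with a package/systemd install (a sketch of the fix, assuming the service runs as the elasticsearch user): the custom directories exist but are not writable by that user, so the process exits during startup before it can do much logging.

```shell
# Make the custom data/log location writable by the service account,
# then restart and check the journal for the real startup error.
sudo mkdir -p /mulelogs/elasticsearch
sudo chown -R elasticsearch:elasticsearch /mulelogs/elasticsearch
sudo systemctl restart elasticsearch
sudo journalctl -u elasticsearch --no-pager | tail -n 50
```

If the service still fails, the journal output (or the Elasticsearch log under the old path.logs) typically names the offending path directly, often as a permissions/access-denied failure on path.data.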