Question:
I set up Elasticsearch and Kibana to index our application (error) logs. The issue is that Kibana doesn't display any data in the "Discover" tab.
Current situation
- Elasticsearch is up and running and responds to API calls.
- Executing a query directly against Elasticsearch, like http://elasticserver.com:9200/applogs/_search?q=*, returns lots of results (see below for what a single record looks like).
- Kibana is up and running and even finds the applogs index exposed by Elasticsearch.
- Kibana also shows the correct properties and data types of the applogs documents.
- The "Discover" tab doesn't show any results, even when setting the time period to a couple of years.
Any ideas??
Here's how Kibana sees the applogs index:
The Elasticsearch query result object looks like this:
{
  _index: "applogs",
  _type: "1",
  _id: "AUxv8uxX6xaLDVAP5Zud",
  _score: 1,
  _source: {
    appUid: "esb.Idman_v4.getPerson",
    level: "trace",
    message: "WS stopwatch is at 111ms.",
    detail: "",
    url: "",
    user: "bla bla bla",
    additionalInfo: "some more info",
    timestamp: "2015-03-31T15:08:49"
  }
},
...and here's what I see in the Discover tab:
Answer 1:
For people who have a problem like this: change the time frame in the top-right corner.
By default, Kibana only shows data for the last 15 minutes.
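If you're not sure which window to pick, you can ask Elasticsearch directly for the oldest and newest timestamps in the index. A minimal sketch, assuming the applogs index from the question, a field named timestamp, and a cluster on localhost:9200 (the curl line is commented out so you can review the request body first):

```shell
# Aggregation body asking for the min and max of the "timestamp" field;
# "size": 0 suppresses the document hits themselves.
body='{
  "size": 0,
  "aggs": {
    "oldest": { "min": { "field": "timestamp" } },
    "newest": { "max": { "field": "timestamp" } }
  }
}'
echo "$body"
# Against a live cluster, send it with:
# curl -s 'http://localhost:9200/applogs/_search' -d "$body"
```

The two aggregation values tell you exactly which time range in Kibana will contain your documents.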
Answer 2:
I wanted to put this as a comment, but unfortunately I am not able to, given my insufficient reputation. So, as @Ngeunpo suggested, this is how you add a time field to an index while creating it. If you did not do that while creating your index, I suggest you delete that index and recreate it. The index name logstash-* in the gif is analogous to your index applogs. In this case, the field @timestamp is added as the time field. Let me know if this works.
EDIT: Image courtesy: this wonderful ELK setup guide.
Answer 3:
Kibana does not understand the timestamp field if its format is incorrect. The timestamp, which you selected by clicking "Time-field name" when configuring the index pattern, needs to look like:
"timestamp":"2015-08-05 07:40:20.123"
Then you should update your index mapping like this:
curl -XPUT 'http://localhost:9200/applogs/1/_mapping' -d '
{
  "1": {
    "properties": {
      "timestamp": {
        "type": "date",
        "format": "yyyy-MM-dd HH:mm:ss.SSS",
        "store": true
      }
    }
  }
}'
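With that mapping in place, documents you index need a timestamp in exactly the yyyy-MM-dd HH:mm:ss.SSS layout. A quick way to produce one from a shell, as a sketch assuming GNU date (whose %3N format gives milliseconds):

```shell
# UTC timestamp with millisecond precision, matching the
# "yyyy-MM-dd HH:mm:ss.SSS" mapping format above.
ts="$(date -u '+%Y-%m-%d %H:%M:%S.%3N')"
echo "$ts"
```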
See this question and answer
UPDATE
If you are using ES 2.x, you can set the "format" to "epoch_millis" like this:
curl -XPUT 'http://localhost:9200/applogs/1/_mapping' -d '
{
  "1": {
    "properties": {
      "timestamp": {
        "type": "date",
        "format": "epoch_millis",
        "store": true,
        "doc_values": true
      }
    }
  }
}'
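With epoch_millis, the field stores milliseconds since the Unix epoch. If whole-second precision is enough, a value in that unit can be produced in a shell like this:

```shell
# Seconds since the epoch, scaled to milliseconds; good enough when
# sub-second precision is not required (GNU date can also emit %3N).
millis=$(( $(date -u +%s) * 1000 ))
echo "$millis"
```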
Answer 4:
Sample image for ELK creating an index pattern.
Try this: untick the "Index contains time-based events" checkbox, provide your index name, and then check whether "Discover" contains data or not.
Answer 5:
I had the same issue, and this worked for me:
- Delete the index from the Settings tab.
- Restart Kibana.
- Then re-add the index in Settings.
The time-series issues can certainly also be a cause, but if no fields at all show up in the Discover tab, then you might have the same issue as the original reporter, which is what I had.
Answer 6:
I had the same issue, and @tAn-'s comment helped me resolve it. Changing the date field to @timestamp did the trick. Thanks!
The next step should be to find out what was wrong with my custom date field.
Answer 7:
I had the same problem, but now it's working fine. The problem was with @timestamp. I had uploaded the file to Elasticsearch using Logstash, which automatically generates a @timestamp field. Kibana compares the time range against this @timestamp, i.e. against when the actual event occurred. Even if I deselect the "Index contains time-based events" option on the add-new-index-pattern page, Kibana still automatically considers the @timestamp field, so adjusting the time frame in Kibana based on the @timestamp field worked for me. You can also test this by adding an index pattern without a timestamp and deselecting the "Index contains time-based events" option: there won't be any time-frame selector on the Kibana Discover page, and you will most probably get results there. These are all just my observations; I'm not sure this solution fits your case, but you may try it. I am using ES 1.5.x, Logstash 1.5.1, and Kibana 4.1.0.
Answer 8:
I also experienced the same error. Mostly this happens because of the time format. Basically, make sure you have a valid time frame for your data (top-right filter). Anyway, in my case I used the plain epoch (seconds) format for the timestamp, but it didn't work. So I changed to epoch_millis instead, and it worked like a charm.
In sum, make sure that Kibana can understand your date-time format. It expects epoch_millis by default, not plain epoch.
Answer 9:
I had probably the same issue: I could see data in the dashboard but 0 results in Discover. Going to Management > Index Patterns and clicking the "Refresh field list" button (a button with only a refresh icon) solved it for me.
Answer 10:
Not relevant to the OP, but if your Kibana instance crashes while you try to get it to recognise your index (under Settings > Indices), see https://github.com/elastic/kibana/issues/6419, which fixed the problem for me too.
Answer 11:
I had the same issue. So, as shown in one of the solutions above, I went to Settings, deleted the previous index, and made a new one with @timestamp.
But that didn't solve the issue. So I looked into it further and saw that, after a deployment, nothing was coming into Kibana.
I went onto the server and saw that the indexes were corrupted, so I just stopped Logstash and Elasticsearch on the instance/server and restarted the services. And voilà, the services restarted successfully and Kibana was back.
WHY DID IT HAPPEN?
Someone might have stopped the server abruptly, which caused the indexes to get corrupted.
Answer 12:
In my situation, everything was working previously, and then I couldn't see the latest data starting February 1st (actually, I could if I looked back a month). It turned out that the mapping format for my custom time field was incorrect: it was YYYY-MM-DD'T'HH:mm:ss.SSSZ. The problem was that DD is interpreted as day of the year, whereas I wanted day of the month, which is dd. Changing the mapping and reindexing fixed the problem.
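The DD-versus-dd mix-up is easy to reproduce outside Elasticsearch: GNU date has the analogous pair %j (day of year) and %d (day of month), and for February 1st the two diverge exactly as described (a sketch assuming GNU date; TZ is pinned to UTC so the result is timezone-independent):

```shell
# For 2015-02-01, day-of-year is 032 (31 days of January plus 1),
# while day-of-month is 01.
doy="$(TZ=UTC date -d '2015-02-01 12:00' '+%j')"
dom="$(TZ=UTC date -d '2015-02-01 12:00' '+%d')"
echo "$doy $dom"   # prints "032 01"
```

So a date written with the wrong token parses without error but lands on the wrong day, which is why the data only seemed to disappear once the calendar rolled past a point where the two interpretations differ.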