How to fix exceeded limit of maxWarmingSearchers?

Asked by 孤街浪徒 on 2021-01-30 23:13

Does anyone know why this happens and how to resolve it? I have very heavy update and search traffic at the same time.

    Error opening new searcher. Exceeded limit of maxWarmingSearchers

3 Answers
  •  离开以前, 2021-01-30 23:37

    As per the Solr FAQ: What does "exceeded limit of maxWarmingSearchers=X" mean?

    If you encounter this error a lot, you can (in theory) increase the number in your maxWarmingSearchers, but that is risky to do unless you are confident you have the system resources (RAM, CPU, etc...) to do it safely. A more correct way to deal with the situation is to reduce how frequently you send commits.
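
    If you do decide to raise the limit, maxWarmingSearchers is a single element in solrconfig.xml. A minimal sketch, assuming you edit solrconfig.xml by hand; the value 4 here is purely illustrative, not a recommendation:

      <!-- in solrconfig.xml: allow up to 4 searchers to warm concurrently.
           Raising this trades RAM/CPU for tolerance of frequent commits. -->
      <maxWarmingSearchers>4</maxWarmingSearchers>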

    What this error means is that you are basically making commits too often: each commit tells Solr to throw away its caches and warm a new searcher against the new data, and the warming searchers can't keep up with that pace. You need to decrease the frequency at which you make commits. You can find more information about this problem in Near Realtime Search Tuning, but the basic idea is that the more facets you use, the longer the interval you will need between commits.
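
    A related knob, not used in the rest of this answer but sketched here as one option, is the commitWithin attribute on an XML update message: it asks Solr to commit within the given number of milliseconds instead of immediately, letting Solr coalesce nearby commits on its own. The document fields below are made up for illustration:

      <!-- POSTed to the /update handler: commit within 10 seconds, not immediately -->
      <add commitWithin="10000">
        <doc>
          <field name="id">doc-1</field>
          <field name="title">Example document</field>
        </doc>
      </add>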

    One way I got around this was to stop making manual commits (i.e. having my application submit data to Solr and then execute a commit request) and to turn on Solr autocommit instead.

    Here's an example:

    <autoCommit>
      <maxDocs>10000</maxDocs>           <!-- commit after this many uncommitted docs -->
      <maxTime>15000</maxTime>           <!-- or after this many milliseconds -->
      <openSearcher>false</openSearcher> <!-- don't open a new searcher on hard commit -->
    </autoCommit>

    You will have to figure out how large an interval you need (i.e. maxTime), but in practice, every time I add more faceted search to my application (or more indexes, or what have you), I have to increase the interval.

    If you need more real-time search than the frequency of these commits allows, you can look into Solr soft commits.
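
    For reference, a soft-commit interval is configured next to autoCommit in solrconfig.xml. A minimal sketch; the 1000 ms value is illustrative and should be tuned the same way as maxTime above:

      <!-- soft commits make new documents searchable cheaply;
           pair with openSearcher=false on the hard autoCommit above -->
      <autoSoftCommit>
        <maxTime>1000</maxTime>
      </autoSoftCommit>

    With this combination, soft commits control when new documents become visible, while the hard autoCommit only flushes data to disk.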
