How to fix "exceeded limit of maxWarmingSearchers"?


Does anyone know why this happens and how to resolve it? I have very heavy updates and searches running at the same time.

Error opening new searcher.
Exceeded limit of maxWarmingSearchers

3 Answers
  • 2021-01-30 23:33

    As is well explained in the Solr FAQ (quoted in the next answer), you should reduce the number of commits you make, or raise the value of maxWarmingSearchers in solrconfig.xml (which is not good practice).
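
    If you do decide to raise the limit, it is a one-line change in solrconfig.xml. A minimal sketch (the value 4 is purely illustrative; Solr's historical default is 2):

    <!-- solrconfig.xml -->
    <!-- illustrative value: only raise this if you have the RAM/CPU headroom -->
    <maxWarmingSearchers>4</maxWarmingSearchers>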

  • 2021-01-30 23:37

    As per the Solr FAQ: What does "exceeded limit of maxWarmingSearchers=X" mean?

    If you encounter this error a lot, you can (in theory) increase the number in your maxWarmingSearchers, but that is risky to do unless you are confident you have the system resources (RAM, CPU, etc...) to do it safely. A more correct way to deal with the situation is to reduce how frequently you send commits.

    What this error means is that you are making commits too often, and the internal caches can't keep up with the rate at which you are saying "clear the cache and let me search the new data". You need to decrease the frequency of your commits. You can find more information about this problem in Near Realtime Search Tuning, but the basic idea is: the more facets you use, the greater the interval you will need between commits.

    One way I got around this was to stop making manual commits (i.e. having my application submit data to Solr and then execute a commit request) and to turn on Solr autocommit.

    Here's an example:

    <!-- solrconfig.xml -->
    <autoCommit>
      <maxDocs>10000</maxDocs> <!-- maximum uncommitted docs before an autocommit is triggered -->
      <maxTime>15000</maxTime> <!-- maximum time (in ms) after adding a doc before an autocommit is triggered -->
      <openSearcher>false</openSearcher> <!-- Solr 4.0+: optionally skip opening a searcher on hard commit; useful to keep the transaction logs that track uncommitted updates small -->
    </autoCommit>
    

    You will have to figure out how large an interval you need (i.e. maxTime), but in practice, every time I add more faceted search to my application (or more indexes, or what have you), I have to increase the interval.

    If you need more real-time search than the frequency of these commits allows, you can look into Solr soft commits.
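
    Soft commits are configured much like hard autocommits. A minimal sketch, using the standard autoSoftCommit element (the 1000 ms value is illustrative):

    <!-- solrconfig.xml -->
    <!-- soft commits make new documents searchable quickly without the
         expense of a hard commit; pair with openSearcher=false above -->
    <autoSoftCommit>
      <maxTime>1000</maxTime> <!-- illustrative: soft-commit at most once per second -->
    </autoSoftCommit>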

  • As per the Solr wiki page on CommitWithin (https://wiki.apache.org/solr/CommitWithin):

    There are multiple commit strategies in Solr. The most known is explicit commits from the client. Then you have AutoCommit, configured in solrconfig.xml, which lets Solr automatically commit adds after a certain time or number of documents, and finally there may be a behind-the-scenes commit when the input buffer gets full.

    You can use commitWithin to handle this problem.
    In Solr 3.5 and later, SolrJ lets you pass it directly when adding a document:

    server.add(mySolrInputDocument, 10000); // ask Solr to commit within 10 seconds

    In earlier versions, use an UpdateRequest:

    UpdateRequest req = new UpdateRequest();
    req.add(mySolrInputDocument);
    req.setCommitWithin(10000); // ask Solr to commit within 10 seconds
    req.process(server);

    This reduces how frequently you send commits.
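
    The same option is available outside SolrJ: per the wiki page above, the XML update message accepts a commitWithin attribute on the add element. A minimal sketch (the document fields are placeholders):

    <add commitWithin="10000">
      <doc>
        <field name="id">example-1</field> <!-- hypothetical field value -->
      </doc>
    </add>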
