How to make sure Solr/Lucene won't die with java.lang.OutOfMemoryError?

Asked by 失恋的感觉, 2021-02-04 06:31

I'm really puzzled why it keeps dying with java.lang.OutOfMemoryError during indexing even though it has a few GBs of memory.

Is there a fundamental reason why it needs so much memory?

8 Answers
  •  一整个雨季, 2021-02-04 06:46

    A wild guess: the documents you are indexing are very large.

    By default, Lucene indexes only the first 10,000 terms of a document to avoid OutOfMemory errors. You can raise this limit; see IndexWriter.setMaxFieldLength.
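
    That limit can be raised when constructing the writer. A minimal sketch, assuming the old Lucene 2.x/3.x API (setMaxFieldLength was deprecated in 3.1 and removed in 4.0, where the truncation behavior moved to LimitTokenCountAnalyzer); the index path and the 50,000-term cap are arbitrary example values:

    ```java
    import java.io.File;

    import org.apache.lucene.analysis.standard.StandardAnalyzer;
    import org.apache.lucene.index.IndexWriter;
    import org.apache.lucene.store.FSDirectory;
    import org.apache.lucene.util.Version;

    public class RaiseFieldLimit {
        public static void main(String[] args) throws Exception {
            IndexWriter writer = new IndexWriter(
                    FSDirectory.open(new File("/tmp/index")),       // example path
                    new StandardAnalyzer(Version.LUCENE_30),
                    IndexWriter.MaxFieldLength.UNLIMITED);          // lift the 10,000-term default

            // Or keep a finite cap instead of UNLIMITED:
            writer.setMaxFieldLength(50000);                        // example value, not a recommendation

            writer.close();
        }
    }
    ```

    Note that UNLIMITED trades the truncation safeguard away: a single pathological document can then consume arbitrary heap during analysis, which is exactly the failure mode being asked about here.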

    Also, you could call optimize() and close() on the IndexWriter as soon as you are done processing.
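
    Closing promptly matters because the writer buffers documents in memory until it flushes. A sketch of that pattern, again assuming the pre-4.0 API where optimize() exists (it was later renamed forceMerge); the directory and analyzer setup are placeholders:

    ```java
    import java.io.File;

    import org.apache.lucene.analysis.standard.StandardAnalyzer;
    import org.apache.lucene.index.IndexWriter;
    import org.apache.lucene.store.FSDirectory;
    import org.apache.lucene.util.Version;

    public class IndexAndRelease {
        public static void main(String[] args) throws Exception {
            IndexWriter writer = new IndexWriter(
                    FSDirectory.open(new File("/tmp/index")),       // example path
                    new StandardAnalyzer(Version.LUCENE_30),
                    IndexWriter.MaxFieldLength.LIMITED);
            try {
                // writer.addDocument(...) calls go here
            } finally {
                writer.optimize();  // merges segments; optional, and itself memory/IO-heavy
                writer.close();     // flushes buffered documents and releases heap promptly
            }
        }
    }
    ```

    The finally block guarantees the close even if indexing throws, so buffered state is not left pinned in memory.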

    The definitive way is to profile and find the bottleneck =]
