Solr/Lucene fieldCache OutOfMemory error sorting on dynamic field

逝去的感伤 2020-12-10 07:54

We have a Solr core that has about 250 TrieIntFields (declared as dynamicField). There are about 14M docs in our Solr index and many documents have

2 Answers
  • 2020-12-10 07:58

    We found a way to rework the schema so that sorting uses a single field. Our dynamic fields look like relevance_CLASSID. The current schema has a unique key NODEID and a multi-valued field CLASSID, and the relevance scores are per class ID. If we instead keep one document per class ID per node ID, i.e. the new schema uses NODEID:CLASSID as the unique key and stores some redundant information across documents that share the same NODEID, then we can sort on a single relevance field and add a filter query on CLASSID.
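
    The rework described above can be sketched in schema.xml terms. This is a hypothetical sketch based only on the description; the field names, types, and attributes are assumptions, not the actual schema:

    ```
    <!-- Hypothetical sketch; names and types assumed from the description above. -->

    <!-- Old schema: one doc per node, ~250 relevance_* sort fields populate the fieldCache. -->
    <field name="NODEID" type="string" indexed="true" stored="true" required="true"/>
    <field name="CLASSID" type="tint" indexed="true" stored="true" multiValued="true"/>
    <dynamicField name="relevance_*" type="tint" indexed="true" stored="true"/>

    <!-- New schema: one doc per NODEID:CLASSID pair, a single sortable relevance field. -->
    <field name="id" type="string" indexed="true" stored="true" required="true"/> <!-- "NODEID:CLASSID" -->
    <field name="NODEID" type="string" indexed="true" stored="true"/>
    <field name="CLASSID" type="tint" indexed="true" stored="true"/>
    <field name="relevance" type="tint" indexed="true" stored="true"/>
    <uniqueKey>id</uniqueKey>
    ```

    A query would then use sort=relevance desc with fq=CLASSID:42 instead of sort=relevance_42 desc, so only one field ever needs to be un-inverted for sorting.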

  • 2020-12-10 08:08

    I think you have two options:

    1) Add more memory.
    2) Force Solr not to use the field cache by specifying facet.method=enum, as per documentation.

    There's also a solr-user mailing list thread discussing the same problem.

    Unless your index is huge, I'd go with option 1). RAM is cheap these days.
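
    For option 2), the request parameter goes alongside the usual facet parameters. The core name and field name below are placeholders:

    ```
    http://localhost:8983/solr/mycore/select?q=*:*&facet=true&facet.field=relevance_42&facet.method=enum
    ```

    With facet.method=enum, Solr enumerates the terms of the field and intersects each term's posting list with the result set, instead of un-inverting the whole field into the fieldCache, trading memory for per-request work. Note that this helps faceting; sorting on a field still requires the field to be un-inverted.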
