Why do we need a coarse quantizer?

Submitted by Deadly on 2019-12-08 07:29:24

Question


In Product Quantization for Nearest Neighbor Search, when it comes to section IV.A, it says they will use a coarse quantizer too (which, the way I see it, is just a much smaller product quantizer, smaller w.r.t. k, the number of centroids).

I don't really get why this helps the search procedure, and the cause might be that I don't get the way they use it. Any ideas, please?


Answer 1:


As mentioned in the NON EXHAUSTIVE SEARCH section,

Approximate nearest neighbor search with product quantizers is fast and reduces significantly the memory requirements for storing the descriptors.

Nevertheless, the search is exhaustive.

The coarse quantizer is what makes the search non-exhaustive. It first retrieves a candidate set (the database vectors assigned to the coarse centroids closest to the query), and then ranks only that candidate set with PQ distances; a minimal sketch of this two-step search is given below.
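To make the two-step search concrete, here is a minimal NumPy sketch of the idea. All sizes and names are made up for illustration: the paper learns the coarse centroids with k-means, whereas this sketch just samples them, and exact L2 stands in for the PQ distance computation.

```python
import numpy as np

# Toy setup (illustrative sizes only): 2,000 database vectors, 32 dims.
rng = np.random.default_rng(0)
xb = rng.standard_normal((2_000, 32)).astype(np.float32)
q = rng.standard_normal(32).astype(np.float32)

# "Coarse quantizer": k' centroids. The paper learns them with k-means;
# here we simply sample database vectors to keep the sketch short.
k_coarse = 20
centroids = xb[rng.choice(len(xb), k_coarse, replace=False)]

# Index time: assign every database vector to its nearest coarse centroid,
# which gives one inverted list per centroid.
assign = np.argmin(((xb[:, None, :] - centroids[None, :, :]) ** 2).sum(-1), axis=1)
inverted_lists = {c: np.where(assign == c)[0] for c in range(k_coarse)}

# Query time: visit only the w closest coarse cells (non-exhaustive search).
w = 4
cell_dist = ((centroids - q) ** 2).sum(-1)
candidates = np.concatenate([inverted_lists[c] for c in np.argsort(cell_dist)[:w]])

# The PQ distance computation would now run over `candidates` only, instead of
# over all 2,000 vectors. Exact L2 is used here as a stand-in for PQ distances.
best = candidates[np.argmin(((xb[candidates] - q) ** 2).sum(-1))]
print(f"scanned {len(candidates)} of {len(xb)} vectors, best candidate id = {best}")
```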

Thus, IMO, the performance depends largely on the performance of the coarse quantizer: if the candidate set does not contain some of the true nearest neighbors in the first place, the subsequent PQ step cannot recover them either.

And AFAIK, the coarse-quantizer idea (an inverted-file index) is one of the basic building blocks for ANN search; it doesn't have to be used together with PQ (see the snippet below).
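For a more practical illustration, libraries such as Faiss expose exactly this combination: a flat index serves as the coarse quantizer for an inverted file whose lists store PQ codes (IndexIVFPQ), and nprobe controls how many coarse cells are visited per query. The parameter values below are arbitrary, and swapping IndexIVFPQ for IndexIVFFlat keeps the coarse quantizer but drops PQ, showing that the two techniques are independent.

```python
import numpy as np
import faiss  # pip install faiss-cpu

d, nb = 64, 10_000
xb = np.random.random((nb, d)).astype('float32')

nlist = 100        # number of coarse centroids / inverted lists
m, nbits = 8, 8    # PQ: 8 sub-quantizers, 8 bits each -> 8 bytes per vector

coarse = faiss.IndexFlatL2(d)                         # the coarse quantizer
index = faiss.IndexIVFPQ(coarse, d, nlist, m, nbits)
index.train(xb)    # learns the coarse centroids and the PQ codebooks
index.add(xb)

index.nprobe = 8   # coarse cells visited per query (speed/recall trade-off)
D, I = index.search(xb[:5], 10)
print(I)
```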



Source: https://stackoverflow.com/questions/38388748/why-we-need-a-coarse-quantizer
