Question
In "Product Quantization for Nearest Neighbor Search", section IV.A says they will also use a coarse quantizer (which, the way I see it, is just a much smaller product quantizer, smaller with respect to k, the number of centroids).
I don't really understand why this helps the search procedure, probably because I don't get the way they use it. Any ideas, please?
Answer 1:
As mentioned in the NON EXHAUSTIVE SEARCH section,
Approximate nearest neighbor search with product quantizers is fast and reduces significantly the memory requirements for storing the descriptors.
Nevertheless, the search is exhaustive.
The coarse quantizer is for non-exhaustive search. It retrieves a candidate set first, then searches within the candidate set for nearest neighbors based on PQ.
Thus IMO the performance depends largely on the performance of the coarse quantizer. If the candidate set does not contain some of the true nearest neighbors in the first place, we cannot recover them in the subsequent PQ step either.
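To make the two-stage idea concrete, here is a minimal NumPy sketch of the search side. The names (`inverted_lists`, `pq_codes`, `pq_codebooks`, `w`, `search`) and data layout are my own assumptions, not code from the paper, and for simplicity it quantizes the raw vectors rather than the residuals the paper's IVFADC variant actually encodes:

```python
import numpy as np

def search(query, coarse_centroids, inverted_lists, pq_codes, pq_codebooks, w=8, topk=10):
    """Two-stage search (sketch):
    1. Coarse quantizer: find the w closest coarse centroids to the query and
       gather the database vectors assigned to them (the candidate set).
    2. PQ: rank only those candidates with asymmetric distance computation.
    """
    # --- Stage 1: coarse quantization (the non-exhaustive part) ---
    coarse_dists = np.linalg.norm(coarse_centroids - query, axis=1)
    probed_cells = np.argsort(coarse_dists)[:w]            # visit w inverted lists
    candidate_ids = np.concatenate([inverted_lists[c] for c in probed_cells])

    # --- Stage 2: PQ distances, computed only on the candidate set ---
    m, ks, dsub = pq_codebooks.shape                       # m subquantizers, ks centroids each
    q_sub = query.reshape(m, dsub)
    # Lookup table: squared distance from each query sub-vector to each sub-centroid.
    lut = np.linalg.norm(pq_codebooks - q_sub[:, None, :], axis=2) ** 2   # (m, ks)

    codes = pq_codes[candidate_ids]                        # (n_candidates, m) code indices
    approx_dists = lut[np.arange(m), codes].sum(axis=1)    # sum of per-subspace distances

    order = np.argsort(approx_dists)[:topk]
    return candidate_ids[order], approx_dists[order]
```

The parameter `w` (how many inverted lists are probed) is what trades speed for recall: probing more cells makes the candidate set more likely to contain the true nearest neighbors, at the cost of more PQ distance computations.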
And as far as I know, the coarse-quantizer idea is one of the basic building blocks for ANN search; it doesn't have to be used together with PQ.
Source: https://stackoverflow.com/questions/38388748/why-we-need-a-coarse-quantizer