nearest neighbor - k-d tree - wikipedia proof

Submitted by 橙三吉。 on 2019-11-30 00:40:59
Andrew

Look carefully at the 6th frame of the animation on that page.

As the algorithm walks back up the recursion, it is possible that there is a closer point on the other side of the hyperplane it just descended through. We've checked one half, but there could be an even closer point in the other half.

Well, it turns out we can sometimes make a simplification. If it's impossible for there to be a point on the other half closer than our current best (closest) point, then we can skip that hyperplane half entirely. This simplification is the one shown on the 6th frame.

Figuring out whether this simplification is possible is done by comparing the distance from the hyperplane to our search location. Because the hyperplane is aligned to the axes, the shortest line from it to any point will be a line along one dimension, so we can compare just the coordinate of the dimension that the hyperplane splits.

If it's farther from the search point to the hyperplane than from the search point to your current closest point, then there's no reason to search past that splitting coordinate.

Even if my explanation doesn't help, the graphic will. Good luck on your project!

Yes, the description of NN (Nearest Neighbour) search in a KD Tree on Wikipedia is a little hard to follow. It doesn't help that a lot of the top Google search results on NN KD Tree searches are just plain wrong!

Here's some C++ code to show you how to get it right:

template <class T, std::size_t N>
void KDTree<T,N>::nearest (
    const std::unique_ptr<const KDNode<T,N>> &node,
    const std::array<T, N> &point,              // looking for closest node to this point
    std::shared_ptr<const KDPoint<T,N>> &closest, // closest node (so far)
    double &minDist,
    const std::size_t depth) const
{
    if (node->isLeaf()) {
        const double dist = distance(point, node->leaf->point);
        if (dist < minDist) {
            minDist = dist;
            closest = node->leaf;
        }
    } else {
        const std::size_t dim = depth % N;
        if (point[dim] < node->splitVal) {
            // search left subtree first
            nearest(node->left, point, closest, minDist, depth + 1);
            // only search the right subtree if the ball of radius minDist
            // around the search point crosses the splitting hyperplane
            if (point[dim] + minDist >= node->splitVal)
                nearest(node->right, point, closest, minDist, depth + 1);
        } else {
            // search right subtree first
            nearest(node->right, point, closest, minDist, depth + 1);
            if (point[dim] - minDist <= node->splitVal)
                nearest(node->left, point, closest, minDist, depth + 1);
        }
    }
}

API for NN searching on a KD Tree:

// Nearest neighbour
template <class T, std::size_t N>
const KDPoint<T,N> KDTree<T,N>::nearest (const std::array<T, N> &point) const {
    std::shared_ptr<const KDPoint<T,N>> closest;
    double minDist = std::numeric_limits<double>::max();
    nearest(root, point, closest, minDist);
    return *closest;
}

Default distance function:

template <class T, std::size_t N>
double distance (const std::array<T, N> &p1, const std::array<T, N> &p2) {
    double d = 0.0;
    for (std::size_t i = 0; i < N; ++i) {
        d += std::pow(p1[i] - p2[i], 2.0);
    }
    return std::sqrt(d);
}

Edit: some people are asking for help with the data structures too (not just the NN algorithm), so here is what I have used. Depending on your purpose, you might wish to modify the data structures slightly. (Note: but you almost certainly do not want to modify the NN algorithm.)

KDPoint class:

template <class T, std::size_t N>
class KDPoint {
    public:
        KDPoint<T,N> (std::array<T,N> &&t) : point(std::move(t)) { };
        virtual ~KDPoint<T,N> () = default;
        std::array<T, N> point;
};

KDNode class:

template <class T, std::size_t N>
class KDNode
{
    public:
        KDNode () = delete;
        KDNode (const KDNode &) = delete;
        KDNode & operator = (const KDNode &) = delete;
        ~KDNode () = default;

        // branch node
        KDNode (const T                       split,
                std::unique_ptr<const KDNode> &lhs,
                std::unique_ptr<const KDNode> &rhs) : splitVal(split), left(std::move(lhs)), right(std::move(rhs)) { };
        // leaf node
        KDNode (std::shared_ptr<const KDPoint<T,N>> p) : splitVal(0), leaf(p) { };

        bool isLeaf (void) const { return static_cast<bool>(leaf); }

        // data members
        const T                                   splitVal;
        const std::unique_ptr<const KDNode<T,N>>  left, right;
        const std::shared_ptr<const KDPoint<T,N>> leaf;
};

KDTree class: (Note: you'll need to add a member function to build/fill your tree.)

template <class T, std::size_t N>
class KDTree {
    public:
        KDTree () = delete;
        KDTree (const KDTree &) = delete;
        KDTree (KDTree &&t) : root(std::move(const_cast<std::unique_ptr<const KDNode<T,N>>&>(t.root))) { };
        KDTree & operator = (const KDTree &) = delete;
        ~KDTree () = default;

        const KDPoint<T,N> nearest (const std::array<T, N> &point) const;

        // Nearest neighbour search - runs in O(log n) on average
        void nearest (const std::unique_ptr<const KDNode<T,N>> &node,
                      const std::array<T, N> &point,
                      std::shared_ptr<const KDPoint<T,N>> &closest,
                      double &minDist,
                      const std::size_t depth = 0) const;

        // data members
        const std::unique_ptr<const KDNode<T,N>> root;
};