Given a set of intervals [x, y] where 0 <= x <= y <= 2000,
how do you find the minimum number of points that cover all intervals (i.e. every interval should contain at least one of the chosen points)?
You can use a greedy algorithm:
Sort all intervals by their end points (in increasing order).
Iterate over the sorted array of intervals. For each interval there are two options:
if the interval already contains the most recently chosen point, it is covered and you skip it; otherwise, choose its end point as a new point.
The resulting set generated by this algorithm is optimal: every chosen point belongs to an interval that is disjoint from all previously covered intervals, so any valid solution must use at least as many points.
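A minimal sketch of the greedy in Python (the function name and tuple representation are my own choices, not from the question):

```python
def min_cover_points(intervals):
    """Return a minimum set of points such that every interval
    [x, y] in `intervals` contains at least one of them."""
    points = []
    # Sort by end point in increasing order.
    for x, y in sorted(intervals, key=lambda iv: iv[1]):
        # Case 1: the most recently chosen point already covers this interval.
        if points and x <= points[-1]:
            continue
        # Case 2: choose this interval's end point as a new point.
        points.append(y)
    return points

# Example: [1,3], [2,5], [4,6] -> the two points 3 and 6 cover everything.
print(min_cover_points([(1, 3), (2, 5), (4, 6)]))  # [3, 6]
```

Choosing the *end* point of an uncovered interval is what makes the greedy work: any later interval (sorted by end) that overlaps the current one must contain that end point.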