The question:
How can you limit the input data to achieve a better Big-O complexity? Describe an algorithm for handling this limited data to find if there are any […]
Assume that sorting is our problem.
We know that comparison-based sorting requires Ω(n*log(n)) time, and we can achieve O(n*log(n)) with, for example, merge sort.
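For reference, here is a minimal merge sort sketch in Python (the function name and test values are mine, not from the question):

```python
def merge_sort(a):
    # Recursively sort a list with O(n * log(n)) comparisons.
    if len(a) <= 1:
        return a  # 0 or 1 elements: already sorted
    mid = len(a) // 2
    left = merge_sort(a[:mid])    # sort each half independently
    right = merge_sort(a[mid:])
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        # Take the smaller front element; <= keeps the sort stable.
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])   # at most one of these is non-empty
    merged.extend(right[j:])
    return merged

print(merge_sort([5, 2, 9, 1, 5, 6]))  # [1, 2, 5, 5, 6, 9]
```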
However, if we limit n to some constant, for example n < 10^6, then we can sort any input in O(10^6 * log(10^6)) time, which is O(1) in terms of Big-O: the running time is bounded by a fixed constant that does not depend on the input.
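To make that concrete, the worst-case bound works out to a fixed number (a quick check, assuming base-2 logarithms; the base only changes the constant factor):

```python
import math

n_max = 10**6
# With n capped at n_max, the n*log(n) bound is a fixed number
# that does not grow with the actual input, hence O(1).
bound = n_max * math.log2(n_max)
print(f"{bound:.3e}")  # ~1.993e+07, about 2 * 10^7 steps
```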
The bottom line: if you want to measure performance in terms of Big-O notation, you cannot assume any size limit on the input; once you do, every terminating algorithm degenerates to O(1).