Constants in the formal definition of Big O


I'm revising the formal definitions of Big O and the other associated bounds and something is tripping me up. In the book I'm reading (Skiena), Big O is defined as:

    f(n) = O(g(n)) means c · g(n) is an upper bound on f(n). Thus there exists some constant c such that f(n) is always <= c · g(n), for large enough n (i.e., n >= n0 for some constant n0).

What trips me up is the constant c. It seems like I could choose a very large value for c and make the whole thing arbitrary by blowing out the size of smaller g(n) values. Also, when choosing to classify an algorithm into a complexity class, is the general rule of thumb to just choose the lowest growth class that still holds according to the definition of Big O?
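
In quantifier form, the standard statement of the same definition is:

    f(n) = O(g(n)) \iff \exists\, c > 0,\ \exists\, n_0 :\; f(n) \le c \cdot g(n) \ \text{for all } n \ge n_0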

2 Answers
  • 2021-01-05 02:28

    The reason you can multiply g(n) by an arbitrary constant c is that you only want functions that are within a constant factor of f(n). In simple terms, the analysis is based on n, not on constants, so what you care about is how the functions change with the input size alone. For instance, with f(n) = n^3 and g(n) = n, there is no constant c for which c*n >= n^3 holds for all large n: you would need c >= n^2, which grows with n and is no longer a constant, so f(n) runs away from c*g(n) as n grows.
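
    A quick sketch of that point, using the f(n) = n^3 and g(n) = n from above: the smallest "constant" that would make c*g(n) dominate f(n) at a given n is f(n)/g(n), and it keeps growing, so no fixed c works.

        # For f(n) = n^3 and g(n) = n, the smallest c with c * g(n) >= f(n)
        # at a given n is f(n)/g(n) = n^2, which grows with n -- not a constant.
        def required_c(n: int) -> float:
            f, g = n**3, n
            return f / g

        for n in (10, 100, 1000):
            print(n, required_c(n))  # 100.0, 10000.0, 1000000.0 -- unbounded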

    As Ed mentioned, this analysis won't give you an exact run time but a growth rate as a function of the input size n. If g(n) and f(n) are always at most a constant factor away from each other, then the growth rate is the same for both.

    In this kind of time-complexity analysis we don't really care about constants, which in most cases is fine, but in some cases you should take them into account. For instance, if you are working on small inputs, an O(n^2) algorithm might actually be faster than an O(n log n) one because of the constants.
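
    As a sketch of that effect (the constant factors 100 and 1 below are made up purely for illustration): an algorithm costing 100 * n * log2(n) steps loses to one costing n^2 steps until n gets fairly large.

        import math

        # Hypothetical step counts: a large constant on the O(n log n)
        # algorithm, a small one on the O(n^2) algorithm.
        def cost_nlogn(n: int) -> float:
            return 100 * n * math.log2(n)

        def cost_n2(n: int) -> float:
            return n * n

        # Find the crossover: below it, the "worse" O(n^2) algorithm is faster.
        n = 2
        while cost_n2(n) <= cost_nlogn(n):
            n += 1
        print(f"n^2 is cheaper for n < {n}")  # crossover near n ~ 1000 here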

    Second question: yes, this is a common issue with Big O. You can use an arbitrarily large g(n), which is why we usually try to find the "tightest" g(n) we can; otherwise there's not much point in stating the bound. That's also why Big Theta is more useful than Big O: it tells you a tight bound instead of just an upper bound.
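
    A minimal sketch of why an arbitrary g(n) "works": with f(n) = 2n, each of g(n) = n, n^2, and 2^n satisfies the definition with some witness pair (c, n0), but only the first is tight. (The functions and witnesses below are illustrative choices, not from the original post.)

        # f(n) = 2n is O(n), O(n^2), and O(2^n): the definition only demands
        # *some* upper bound, so loose bounds are technically valid too.
        def holds(f, g, c, n0, n_max=1000) -> bool:
            """Spot-check f(n) <= c * g(n) for all n0 <= n <= n_max."""
            return all(f(n) <= c * g(n) for n in range(n0, n_max + 1))

        f = lambda n: 2 * n
        for name, g, c, n0 in [("n",   lambda n: n,     2, 1),
                               ("n^2", lambda n: n * n, 1, 2),
                               ("2^n", lambda n: 2**n,  1, 3)]:
            print(f"f(n)=2n vs g(n)={name}: c={c}, n0={n0}: {holds(f, g, c, n0)}")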

  • 2021-01-05 02:45

    > When choosing to classify an algorithm into a complexity class, is the general rule of thumb to just choose the lowest growth class that still holds according to the definition of Big O?

    In terms of notation, just as we have big-O for upper bounds, we have big-Omega for lower bounds and big-Theta for when you can show that the upper and lower bounds match.
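
    Spelled out in the same style as the Big O definition above, the other two bounds are:

        f(n) = \Omega(g(n)) \iff \exists\, c > 0,\ \exists\, n_0 :\; f(n) \ge c \cdot g(n) \ \text{for all } n \ge n_0

        f(n) = \Theta(g(n)) \iff f(n) = O(g(n)) \ \text{and} \ f(n) = \Omega(g(n))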

    https://en.wikipedia.org/wiki/Big_O_notation#The_Knuth_definition

    Assuming the Knuth quote is correct, you are not alone in thinking that results involving tight asymptotic bounds are more useful :) Sometimes people say big-O when they actually mean big-Theta, and other times they just don't care or haven't managed to find a matching lower bound.

    > It seems like I could choose a very large value for c, and make the whole thing arbitrary by blowing out the size of smaller g(n) values.

    For functions with different asymptotic growth rates, the value of c doesn't matter: no matter how big or how small you choose c to be, there will be an n at which the faster-growing function catches up. The constant factor is there to let you ignore constant multipliers when functions have the same growth rate. For example, as far as big-O is concerned, f(x) = 2x and g(x) = 3x have the same growth rate: each is within a constant factor of the other.
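
    A short sketch of the "catching up" claim, with f(x) = x^2 and g(x) = x chosen purely for illustration: for any constant c, f(x) exceeds c*g(x) as soon as x > c, so a huge c only delays the crossover, it never prevents it.

        # For f(x) = x^2 vs g(x) = x: no matter how large c is, f overtakes
        # c * g at x = c + 1, so c only shifts *where* the crossover happens.
        for c in (10, 1000, 10**6):
            x = 1
            while x * x <= c * x:  # i.e., while x <= c
                x += 1
            print(f"c={c}: x^2 first exceeds c*x at x={x}")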
