What does O(log n) mean exactly?

Asked by 执念已碎, 2020-11-22 01:19

I am learning about Big O Notation running times and amortized times. I understand the notion of O(n) linear time, meaning that the running time grows proportionally with the size of the input. So what does O(log n) mean exactly?

30 answers
  • 2020-11-22 01:39

    In information technology it means that:

      f(n) = O(g(n)) if there exist a constant C and a threshold N0,
      both independent of n, such that
      C*g(n) > f(n) > 0 for all n > N0.
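
    A quick numeric check of this definition (the function f, the witness C = 4, and the threshold N0 = 5 are illustrative choices, not from the answer):

```python
# Illustrative check: f(n) = 3n + 5 is O(n), witnessed by C = 4, N0 = 5.
def f(n):
    return 3 * n + 5

def g(n):
    return n

C, N0 = 4, 5

# C*g(n) > f(n) > 0 must hold for every n > N0.
assert all(C * g(n) > f(n) > 0 for n in range(N0 + 1, 100_000))
```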
    

    And it seems that this notation was taken over from mathematics.

    In this article there is a quote from D.E. Knuth, "BIG OMICRON AND BIG OMEGA AND BIG THETA", 1976:

    On the basis of the issues discussed here, I propose that members of SIGACT, and editors of computer science and mathematics journals, adopt notations as defined above, unless a better alternative can be found reasonably soon.

    That was written in 1976, and as of today (2016) we still use this notation.


    In mathematical analysis it means that:

      lim sup (f(n)/g(n)) < infinity, where n goes to +infinity
    

    But even in mathematical analysis this symbol is sometimes used in the sense "C*g(n) > f(n) > 0".

    As far as I know from university, the symbol was introduced by the German mathematician Edmund Landau (1877-1938).

  • 2020-11-22 01:42

    I can give an example with a for loop; once the concept is grasped here, it will be simpler to understand in other contexts.

    O(log(n)) means that in the loop the counter grows exponentially with each step. E.g.

    for (i=1; i<=n; i=i*2) {;}
    

    The complexity of this loop in O-notation is O(log(n)). Let's trace it by hand, with n somewhere between 512 and 1023:

    step: 1   2   3   4   5    6    7    8     9     10
       i: 1   2   4   8   16   32   64   128   256   512
    

    Although n is somewhere between 512 and 1023, only 10 iterations take place, because the loop counter grows exponentially and thus reaches termination in only 10 steps.
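
    The iteration count can be verified with a short sketch (Python used here for illustration; the loop mirrors the C-style snippet above):

```python
import math

def doubling_steps(n):
    """Count iterations of: for (i = 1; i <= n; i = i * 2)."""
    steps, i = 0, 1
    while i <= n:
        steps += 1
        i *= 2
    return steps

# Every n between 512 and 1023 takes exactly 10 iterations,
# i.e. floor(log2(n)) + 1.
assert all(doubling_steps(n) == 10 for n in range(512, 1024))
assert doubling_steps(1023) == math.floor(math.log2(1023)) + 1
```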

    The logarithm of x to the base a is the inverse function of a^x: the logarithm is the inverse of the exponential.

    Now try to see it that way, if exponential grows very fast then logarithm grows (inversely) very slow.

    The difference between O(n) and O(log(n)) is huge, similar to the difference between O(n) and O(a^n) (a being a constant).

  • 2020-11-22 01:42

    The binary search example is O(log n), because over a sorted array the search looks like this:

    1 2 3 4 5 6 7 8 9 10 11 12
    

    Searching for 4 takes 3 probes: 6, 3, then 4. And log2(12) ≈ 3.58, which is a good approximation of how many probes were needed.
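
    A sketch of that probe sequence (Python for illustration; the midpoint rule mid = (lo + hi) // 2 is an assumption that reproduces the 6, 3, 4 sequence):

```python
def probes(arr, target):
    """Return the values examined while binary-searching a sorted list."""
    lo, hi = 0, len(arr) - 1
    seen = []
    while lo <= hi:
        mid = (lo + hi) // 2   # assumed midpoint rule
        seen.append(arr[mid])
        if arr[mid] == target:
            break
        elif arr[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return seen

# Searching 4 in 1..12 probes 6, then 3, then 4 -- about log2(12) probes.
assert probes(list(range(1, 13)), 4) == [6, 3, 4]
```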

  • 2020-11-22 01:44

    Logarithmic running time (O(log n)) essentially means that the running time grows in proportion to the logarithm of the input size - as an example, if 10 items takes at most some amount of time x, and 100 items takes at most, say, 2x, and 10,000 items takes at most 4x, then it's looking like an O(log n) time complexity.
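
    Those numbers can be checked against a pure logarithmic model (the unit cost x and the base-10 scale are illustrative assumptions):

```python
import math

x = 1.0  # hypothetical cost for 10 items

def t(n):
    """Assumed model: running time proportional to log10(n)."""
    return x * math.log10(n)

# 10 items -> x, 100 items -> 2x, 10,000 items -> 4x, as described above.
assert math.isclose(t(10), x)
assert math.isclose(t(100), 2 * x)
assert math.isclose(t(10_000), 4 * x)
```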

  • 2020-11-22 01:44

    I can add something interesting that I read in the book by Cormen et al. a long time ago. Now imagine a problem where we have to find a solution in a problem space. This problem space should be finite.

    Now, if you can prove that at every iteration your algorithm cuts off a fraction of this space that is no less than some fixed limit, then your algorithm runs in O(log N) time.

    I should point out that we are talking here about a relative fraction limit, not an absolute one. Binary search is the classical example: at each step we throw away 1/2 of the problem space. But binary search is not the only such case. Suppose you somehow proved that at each step you throw away at least 1/128 of the problem space. Then your program still runs in O(log N) time, although significantly slower than binary search. This is a very useful hint when analyzing recursive algorithms: it can often be shown that at each step the recursion rules out several variants, and this cuts off some fixed fraction of the problem space.
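
    The constant-fraction argument can be simulated (the shrink-until-below-one-element model is an illustrative assumption):

```python
import math

def steps_to_shrink(size, fraction):
    """Count steps until the space drops below 1 element,
    removing `fraction` of it per step."""
    steps = 0
    while size >= 1:
        size *= (1 - fraction)
        steps += 1
    return steps

n = 10**6
half = steps_to_shrink(n, 1 / 2)      # binary search: drop 1/2 each step
small = steps_to_shrink(n, 1 / 128)   # drop only 1/128 each step

# Both are O(log N); dropping 1/128 only changes the constant factor,
# roughly by log(2) / -log(127/128), i.e. about 88x.
assert half <= math.log2(n) + 1
assert small < 100 * half
```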

  • 2020-11-22 01:45

    First, I recommend that you read the following book:

    Algorithms (4th Edition)

    Here are some functions and their expected complexities; the numbers indicate statement execution frequencies.

    The following Big-O complexity chart is also taken from bigocheatsheet:

    Lastly, a very simple showcase shows how the complexity is calculated:

    Anatomy of a program’s statement execution frequencies.

    Analyzing the running time of a program (example).
