Complexity and Run Times

轻奢々 2021-01-29 03:21

I tried looking around to see if my question had already been answered, but I haven't stumbled upon anything that could help me.

When dealing with run-time complexities, do you account for the

1 Answer
  • 2021-01-29 04:15

    I suggest you read up on what the O notation means, but let me present a brief overview:

    When we say f(x) = O(g(x)), we mean that for some constants c and k independent of the input size,

    f(x) <= c * g(x) for all x >= k
    

    In other words, beyond a certain point k, the curve f(x) is always bounded above by the curve c * g(x), as shown in the figure.

    [figure: f(x) lying below c * g(x) to the right of k]
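    This definition is easy to check numerically. A minimal Python sketch — the function f(x) = 3x² + 5x and the witnesses c = 4, k = 5 are illustrative choices, not taken from the answer:

    ```python
    # Hypothetical example: f(x) = 3x^2 + 5x is O(x^2).
    # Pick witnesses c = 4 and k = 5; then f(x) <= c * g(x) for all x >= k,
    # since 3x^2 + 5x <= 4x^2 exactly when x >= 5.
    def f(x):
        return 3 * x * x + 5 * x

    def g(x):
        return x * x

    c, k = 4, 5

    # Spot-check the inequality at many points beyond k.
    assert all(f(x) <= c * g(x) for x in range(k, 10_000))
    print("f(x) <= c*g(x) holds for all checked x >= k")
    ```

    Note that the inequality fails just below k (f(4) = 68 > 64 = c * g(4)), which is exactly why the definition only asks for the bound beyond some point.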

    Now in the case you have considered, the operations of addition, subtraction, and multiplication are all primitive operations that take constant time, O(1). Let's say the addition of two numbers takes time a and assigning the result takes time b.

    So for this code:

      for (i = 0; i < n; i++)
        for (j = 0; j < n; j++)
          a[i][j] = b[i][j] + c[i][j];
    

    Let's be sloppy and ignore the for-loop operations of assignment and update. The running time is T(n) = (a + b)n².
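    Under these assumptions the cost can be tallied directly. A quick Python translation of the loop, with counters added for illustration — one addition (cost a) and one assignment (cost b) per inner iteration:

    ```python
    # Translate the nested loop and count primitive operations.
    # Each inner iteration does one addition (cost a) and one assignment
    # (cost b), so the total running time is T(n) = (a + b) * n^2.
    def fill_and_count(n):
        b = [[1] * n for _ in range(n)]
        c = [[2] * n for _ in range(n)]
        a = [[0] * n for _ in range(n)]
        additions = assignments = 0
        for i in range(n):
            for j in range(n):
                a[i][j] = b[i][j] + c[i][j]
                additions += 1
                assignments += 1
        return additions, assignments

    adds, assigns = fill_and_count(8)
    print(adds, assigns)  # each counter is n^2 = 64
    ```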

    Notice that this is just O(n²). Why?

    As per the definition, this means we can identify some point k beyond which, for some constant c, the curve T(n) is always bounded above by c·n².

    Realize that this is indeed true: we can always pick a sufficiently large constant c so that the curve c·n² bounds the given curve from above.

    This is why people say: drop the constants!

    Bottom line: f(n) = O(g(n)) means that, to the right of some vertical line, the curve f(n) is always bounded above by c·g(n).
