asymptotic-complexity

Confused about how to find C and k for big-O notation when f(x) = x^2 + 2x + 1

狂风中的少年 posted on 2019-12-08 11:25:59
Question: I am studying big-O notation from this book. The definition of big-O notation is: We say that f(x) is O(g(x)) if there are constants C and k such that |f(x)| ≤ C|g(x)| whenever x > k. Now here is the first example: EXAMPLE 1 Show that f(x) = x^2 + 2x + 1 is O(x^2). Solution: We observe that we can readily estimate the size of f(x) when x > 1, because x < x^2 and 1 < x^2 when x > 1. It follows that 0 ≤ x^2 + 2x + 1 ≤ x^2 + 2x^2 + x^2 = 4x^2 whenever x > 1. Consequently, we can take C = 4 and k = 1 as witnesses to
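A quick numeric sanity check of those witnesses (a minimal Python sketch; the names f and bound are just illustrative, not from the book):

def f(x):
    return x**2 + 2*x + 1

def bound(x, C=4):
    return C * x**2

# With the witnesses C = 4 and k = 1, |f(x)| <= C*x^2 should hold for every x > 1.
assert all(f(x) <= bound(x) for x in range(2, 10000))
print("f(x) <= 4*x^2 holds for all tested x > 1")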

`repr` and `int` take quadratic time in Python

99封情书 posted on 2019-12-07 09:55:30
I was making a table of different run-times for Python 2.7 and noticed something that I cannot explain: the run-time of repr(2**n) and int('1'*n) is O(n^2). I always assumed that converting between integer and string would be O(n), with n being the number of digits. The results show that an O(n) fit gives ~30% error, while O(n^2) gives only ~5%. Could anyone explain, please? Here are the results that I get (codes are below): Test Number-1 -- time to compute int('1'*n) (fit to O(n**2)) Spec_string: 1000<=n<=10000 by factors of 2 var_list ['n'] Function list: ('n**2', 'n', '1') run times: n = 1000 :
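A minimal timing sketch of the two conversions (illustrative only, not the question's original benchmark code; note that CPython 3.11+ caps int <-> str conversion length, and recent CPython versions use faster algorithms for very large values, so the quadratic fit applies to older interpreters such as the 2.7 build in the question):

import sys
import time

# CPython 3.11+ limits int <-> decimal-string conversion length; lift it for this experiment.
if hasattr(sys, "set_int_max_str_digits"):
    sys.set_int_max_str_digits(0)  # 0 means "no limit"

def timed(fn):
    start = time.perf_counter()
    fn()
    return time.perf_counter() - start

# If the conversion is quadratic in the number of digits, doubling n should
# roughly quadruple both times.
for n in (25000, 50000, 100000, 200000):
    big = 2 ** n       # build the operands outside the timed calls
    digits = "1" * n
    print(n, timed(lambda: repr(big)), timed(lambda: int(digits)))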

How do I perform a deletion of the kth element on a min-max heap?

空扰寡人 posted on 2019-12-06 14:19:48
Question: A min-max heap can be useful to implement a double-ended priority queue because of its constant-time find-min and find-max operations. We can also remove the minimum and maximum elements of the min-max heap in O(log2 n) time. Sometimes, though, we may also want to delete any node in the min-max heap, and this can be done in O(log2 n), according to the paper which introduced min-max heaps: ... The structure can also be generalized to support the operation Find(k) (determine the kth
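The delete-at-index idea is easiest to see on an ordinary binary min-heap first: overwrite the target with the last element, pop it, then restore the heap property by sifting the replacement down or up, all in O(log n). A min-max heap applies the same pattern, just with separate min-level and max-level sift routines. A small Python sketch of the plain min-heap version (illustrative only, not the paper's min-max algorithm):

def delete_at(heap, i):
    """Delete heap[i] from a binary min-heap (a Python list) in O(log n)."""
    last = heap.pop()
    if i == len(heap):          # the target was the last element; nothing to fix
        return
    heap[i] = last              # overwrite the target with the old last element
    _sift_down(heap, i)         # restore order below i ...
    _sift_up(heap, i)           # ... or above i (at most one of these moves it)

def _sift_up(heap, i):
    while i > 0:
        parent = (i - 1) // 2
        if heap[i] < heap[parent]:
            heap[i], heap[parent] = heap[parent], heap[i]
            i = parent
        else:
            break

def _sift_down(heap, i):
    n = len(heap)
    while True:
        smallest = i
        for child in (2 * i + 1, 2 * i + 2):
            if child < n and heap[child] < heap[smallest]:
                smallest = child
        if smallest == i:
            break
        heap[i], heap[smallest] = heap[smallest], heap[i]
        i = smallest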

Difference between O(m+n) and O(mn)?

霸气de小男生 posted on 2019-12-06 11:19:56
Question: I was trying to find the complexity of an algorithm via different approaches. Mathematically I came across one O(m+n) and another O(mn) approach. However, I am unable to grasp, or say visualize, this. It's not like I look at them and get the "Ahh! That's what's going on" feeling! Can someone explain this using their own examples or any other tool? Answer 1: My recommendation for finding intuition is thought experiments as follows: First, realize that m and n are two different measurements of the
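One way to make the difference concrete is two small loops (a hypothetical Python sketch with lists xs of length m and ys of length n):

def separate_passes(xs, ys):
    # O(m + n): one pass over xs followed by one pass over ys; the costs add.
    total = 0
    for x in xs:        # m iterations
        total += x
    for y in ys:        # n iterations
        total += y
    return total

def all_pairs(xs, ys):
    # O(m * n): every element of xs is combined with every element of ys; the costs multiply.
    pairs = []
    for x in xs:            # m iterations
        for y in ys:        # n iterations for each x
            pairs.append((x, y))
    return pairs

With m = n = 1000, the first function does about 2000 additions, while the second builds 1,000,000 pairs.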

How can an algorithm that is O(n) also be O(n^2), O(n^1000000), O(2^n)?

杀马特。学长 韩版系。学妹 posted on 2019-12-06 04:50:42
So the answer to this question What is the difference between Θ(n) and O(n)? states that "Basically when we say an algorithm is of O(n), it's also O(n^2), O(n^1000000), O(2^n), ... but a Θ(n) algorithm is not Θ(n^2)." I understand big O to represent an upper bound or worst case; with that, I don't understand how O(n) is also O(n^2) and the other cases worse than O(n). Perhaps I have some fundamental misunderstanding. Please help me understand this, as I have been struggling for a while. Thanks. It's helpful to think of what big-Oh means: if a function is O(n), then c*n, where c is some
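The containment follows directly from the definition: witnesses that certify O(n) also certify every larger bound. A short chain of inequalities (a sketch, assuming f(n) ≤ C*n for all n ≥ k, with k ≥ 1):

    f(n) ≤ C*n ≤ C*n^2 ≤ C*n^1000000,   and   f(n) ≤ C*n ≤ C*2^n,

since 1 ≤ n ≤ n^2 ≤ n^1000000 and n ≤ 2^n for all n ≥ 1. The same pair (C, k) therefore witnesses each of those upper bounds. Θ(n^2), by contrast, would also require a lower bound c*n^2 ≤ f(n) for large n, which a linearly bounded function cannot satisfy.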

How to solve for this recurrence T(n) = T(n − 1) + lg(1 + 1/n), T(1) = 1?

牧云@^-^@ posted on 2019-12-05 12:40:35
I got stuck on this recurrence: T(n) = T(n − 1) + lg(1 + 1/n), T(1) = 1, for a while, and it seems the master method cannot be applied to this one. We have: lg(1 + 1/n) = lg((n + 1)/n) = lg(n + 1) − lg(n). Hence:
T(n) − T(n − 1) = lg(n + 1) − lg(n)
T(n − 1) − T(n − 2) = lg(n) − lg(n − 1)
...
T(3) − T(2) = lg(4) − lg(3)
T(2) − T(1) = lg(3) − lg(2)
Adding and telescoping, we get: T(n) − T(1) = lg(n + 1) − lg(2), so T(n) = 1 + lg(n + 1) − lg(2) = lg(n + 1) (taking lg as log base 2, so lg(2) = 1). Hence T(n) = O(lg(n)). Same answer as the other correct answer here, just proved differently. All the following equations are created from the given
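A quick numeric check of that closed form (a small Python sketch, not from the original answers):

from math import log2

def T(n):
    # Evaluate the recurrence T(n) = T(n-1) + lg(1 + 1/n) directly, with T(1) = 1.
    t = 1.0
    for k in range(2, n + 1):
        t += log2(1 + 1 / k)
    return t

# Both columns should agree: T(n) = lg(n + 1).
for n in (1, 10, 100, 1000, 10000):
    print(n, round(T(n), 6), round(log2(n + 1), 6))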

Asymptotic analysis of three nested for loops

时光总嘲笑我的痴心妄想 posted on 2019-12-05 07:50:16
I want to calculate the theta complexity of this nested for loop:

for (int i = 0; i < n; i++) {
    for (int j = 0; j < i; j++) {
        for (int k = 0; k < j; k++) {
            // statement
        }
    }
}

I'd say it's n^3, but I don't think this is correct, because each for loop does not go from 1 to n. I did some tests: n = 5 -> 10, n = 10 -> 120, n = 30 -> 4060, n = 50 -> 19600. So it must be between n^2 and n^3. I tried the summation formula and such, but my results are way too high. I thought of n^2 log(n), but that's also wrong... It is O(N^3). The exact count of the innermost statement is N*(N−1)*(N−2)/6, the binomial coefficient C(N, 3), which matches the tests above. Using Sigma Notation is an efficient step by step
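A brute-force count confirms that closed form (a small Python sketch mirroring the question's loops):

from math import comb

def count_statements(n):
    # How many times would the innermost statement run?
    count = 0
    for i in range(n):
        for j in range(i):
            for k in range(j):
                count += 1
    return count

# Matches the tests in the question: 5 -> 10, 10 -> 120, 30 -> 4060, 50 -> 19600.
for n in (5, 10, 30, 50):
    print(n, count_statements(n), n * (n - 1) * (n - 2) // 6, comb(n, 3))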

Asymptotic complexity for typical expressions

耗尽温柔 posted on 2019-12-05 04:06:09
Question: The increasing order of the following functions shown in the picture below, in terms of asymptotic complexity, is:
(A) f1(n); f4(n); f2(n); f3(n)
(B) f1(n); f2(n); f3(n); f4(n)
(C) f2(n); f1(n); f4(n); f3(n)
(D) f1(n); f2(n); f4(n); f3(n)
a) The time-complexity order for this easy question was given as (n^0.99)*(log n) < n ...... how? log might be a slow-growing function, but it still grows faster than a constant.
b) Consider function f1; suppose it is f1(n) = (n^1.0001)(log n), then what would be the
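For sub-question (a), a limit comparison makes the ordering clearer (a sketch, not part of the original question):

    (n^0.99 * log n) / n = log n / n^0.01 → 0 as n → ∞,

because log n = o(n^ε) for every fixed ε > 0. So (n^0.99)*(log n) is asymptotically smaller than n: log n does grow without bound, but it eventually loses to the polynomial factor n^0.01. The crossover is very late (the ratio keeps increasing until roughly n ≈ e^100 before decaying toward 0), which is why plugging in small values of n can be misleading here.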