asymptotic-complexity

Which Big-O grows faster asymptotically

懵懂的女人 submitted on 2019-12-23 14:54:09
Question: I got into an argument/debate recently and am trying to reach a clear verdict on the correct answer. It is well known that n! grows very quickly, but exactly how quickly? Quickly enough to "hide" any additional constants that might be added to it? Let's assume I have this silly and simple program (no particular language): for i from 0 to n! do: ; // nothing. Given that the input is n, the complexity of this is obviously O(n!) (or even Θ(n!), but that isn't relevant here). Now let's
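The "hiding constants" claim can be sanity-checked numerically; here is a minimal Python sketch (the constants c1 and c2 are arbitrary illustrative choices, not from the question):

```python
import math

# n! dominates any fixed constants: (c1*n! + c2) / n! -> c1 as n grows,
# so additive and multiplicative constants never change the Theta class.
def dominated_ratio(n, c1=3, c2=1000):
    f = math.factorial(n)
    return (c1 * f + c2) / f
```

Already at n = 15 the ratio is indistinguishable from c1, which is why O(c1 * n! + c2) collapses to O(n!).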

How can an algorithm that is O(n) also be O(n^2), O(n^1000000), O(2^n)?

萝らか妹 submitted on 2019-12-22 10:34:11
Question: The answer to the question "What is the difference between Θ(n) and O(n)?" states that "basically, when we say an algorithm is O(n), it's also O(n^2), O(n^1000000), O(2^n), ... but a Θ(n) algorithm is not Θ(n^2)." I understand big-O to represent an upper bound, or worst case; given that, I don't understand how O(n) is also O(n^2) and the other classes worse than O(n). Perhaps I have some fundamental misunderstanding. Please help me understand this, as I have been struggling for a while.
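One way to internalize this is to check the definition directly: f = O(g) only requires the existence of some witnesses c and n0. A small Python sketch (the function 5n and the candidate bounds are arbitrary examples):

```python
# Big-O only demands an upper bound: f is O(g) if there exist c > 0 and n0
# with f(n) <= c*g(n) for all n >= n0. The bound need not be tight.
def witnesses_hold(f, g, c, n0, upto=1000):
    return all(f(n) <= c * g(n) for n in range(n0, upto))

# f(n) = 5n is O(n) with c = 5, but the same f is also O(n^2), O(2^n), ...
linear = witnesses_hold(lambda n: 5 * n, lambda n: n, c=5, n0=1)
quadratic = witnesses_hold(lambda n: 5 * n, lambda n: n * n, c=5, n0=1)
exponential = witnesses_hold(lambda n: 5 * n, lambda n: 2 ** n, c=5, n0=1)
```

Any function that upper-bounds f eventually works as g, which is exactly why the quoted statement holds; Θ additionally requires a matching lower bound, which n^2 cannot give for 5n.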

Merge sort worst case running time for lexicographic sorting?

a 夏天 submitted on 2019-12-21 21:25:22
Question: A list of n strings, each of length n, is sorted into lexicographic order using the merge sort algorithm. What is the worst-case running time of this computation? I got this question as homework. I know merge sort sorts in O(n log n) time. For lexicographic order on strings of length n, is it n times n log n, or n^2? Answer 1: Each comparison of the algorithm is O(n) [comparing two strings is O(n) in the worst case: you might detect which is "bigger" only on the last character]. You have O(n log n) comparisons in
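To see where the extra factor comes from, here is a sketch of merge sort that counts individual character probes (the helper names are illustrative, not from the question):

```python
def char_compare(a, b, counter):
    # Lexicographic compare counting character probes: worst case O(len),
    # since the decisive character may be the last one.
    for x, y in zip(a, b):
        counter[0] += 1
        if x != y:
            return -1 if x < y else 1
    return (len(a) > len(b)) - (len(a) < len(b))

def merge_sort(items, counter):
    if len(items) <= 1:
        return items
    mid = len(items) // 2
    left = merge_sort(items[:mid], counter)
    right = merge_sort(items[mid:], counter)
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if char_compare(left[i], right[j], counter) <= 0:
            out.append(left[i])
            i += 1
        else:
            out.append(right[j])
            j += 1
    out.extend(left[i:])
    out.extend(right[j:])
    return out

counter = [0]
strings = ["aaab", "aaaa", "aaac", "aaad"]
result = merge_sort(strings, counter)
```

With strings that share long prefixes, each of the O(n log n) comparisons costs up to n character probes, giving O(n^2 log n) in total for n strings of length n.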

complexity for nested loops

|▌冷眼眸甩不掉的悲伤 submitted on 2019-12-21 17:55:50
Question: I am trying to figure out the complexity of a for loop using big-O notation. I have done this before in my other classes, but this one is more rigorous than the others because it is on the actual algorithm. The code is as follows: for(i=n; i>1; i/=2) { for(j=1; j<i; j++) { x+=a; } } // for any size n, and: for(i=1; i<=n; i++, x=1) { for(j=1; j<=i; j++) { for(k=1; k<=j; x+=a, k*=a) { } } } // for any size n. I have concluded that the first loop is of O(n) complexity because it is going
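For the first loop pair, the total work can be counted directly, which supports the O(n) conclusion; a Python sketch counting how many times x += a would execute:

```python
def first_loop_work(n):
    # i halves from n down to 2; the inner loop runs i-1 times per pass,
    # so the total is about n + n/2 + n/4 + ... < 2n, i.e. O(n) overall.
    count = 0
    i = n
    while i > 1:
        for _ in range(1, i):
            count += 1
        i //= 2
    return count
```

So even though the loops are nested, the geometric shrinking of i keeps the total linear; only when the inner bound does not shrink does nesting multiply the cost.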

Graph In-degree Calculation from Adjacency-list

旧时模样 submitted on 2019-12-21 16:49:46
Question: I came across this question in which it was required to calculate the in-degree of each node of a graph from its adjacency-list representation: for each u, for each Adj[i] where i != u, if (i,u) ∈ E: in-degree[u] += 1. Now, by my reckoning, its time complexity should be O(|V||E| + |V|^2), but the solution I consulted instead described it as O(|V||E|). Please help and tell me which one is correct. Answer 1: Rather than O(|V||E|), the complexity of computing in-degrees is O(|E|). Let us consider the
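The O(|E|) bound holds because a single pass over the adjacency lists touches every edge exactly once. A minimal Python sketch (dict-of-lists adjacency; the names are illustrative):

```python
def in_degrees(adj):
    # adj maps each node to its list of out-neighbours.
    # Each node is visited once and each edge once: O(|V| + |E|) total.
    deg = {u: 0 for u in adj}
    for u in adj:
        for v in adj[u]:
            deg[v] += 1
    return deg

graph = {0: [1, 2], 1: [2], 2: [0], 3: [2]}
degrees = in_degrees(graph)
```

The O(|V||E|) figure arises only if, for every pair, you scan a list to test membership; incrementing the target's counter directly avoids that inner scan entirely.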

Are 2^n and 4^n in the same Big-Θ complexity class?

为君一笑 submitted on 2019-12-21 13:00:55
Question: Is 2^n = Θ(4^n)? I'm pretty sure that 2^n is not in Ω(4^n), and thus not in Θ(4^n), but my university tutor says it is. This confused me a lot, and I couldn't find a clear answer via Google. Answer 1: 2^n is NOT big-theta (Θ) of 4^n, because 2^n is NOT big-omega (Ω) of 4^n. By definition, f(x) = Θ(g(x)) if and only if f(x) = O(g(x)) and f(x) = Ω(g(x)). Claim: 2^n is not Ω(4^n). Proof: Suppose 2^n = Ω(4^n); then by the definition of big-omega there exist constants c > 0 and n0 such that: 2
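The answer's key observation can be checked numerically: 2^n / 4^n = (1/2)^n shrinks toward 0, so no constant c > 0 can keep 2^n ≥ c·4^n for all large n. A quick Python sketch:

```python
# 2^n / 4^n = (1/2)^n -> 0, so for any candidate constant c > 0 there is
# some n beyond which 2^n >= c * 4^n fails, violating the big-omega
# definition.
ratios = [2 ** n / 4 ** n for n in (1, 10, 20)]

def omega_fails(c):
    # Find an n where 2^n >= c * 4^n no longer holds.
    return any(2 ** n < c * 4 ** n for n in range(1, 200))
```

Even a tiny candidate constant like c = 1e-9 is defeated once n passes about 30, which is the content of the proof the answer begins.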

Role of lower order terms in big O notation

一笑奈何 submitted on 2019-12-18 18:08:37
Question: In big-O notation, we always say that we should ignore constant factors in most cases. That is, rather than writing 3n^2 - 100n + 6, we are almost always satisfied with n^2, since that is the fastest-growing term in the expression. But I found that many algorithm courses start by comparing functions with many terms, such as 2n^2 + 120n + 5 = O(n^2), finding c and n0 for those long functions, before recommending to ignore lower-order terms in the end. My question is: what would I get from trying to
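The exercise those courses assign can be sketched concretely: bound each lower-order term by a multiple of the leading one. For 2n^2 + 120n + 5, once n ≥ 1 we have 120n ≤ 120n^2 and 5 ≤ 5n^2, which yields the witnesses c = 127, n0 = 1. A Python check of that pair:

```python
def f(n):
    return 2 * n ** 2 + 120 * n + 5

# Witness pair c = 127, n0 = 1: each lower-order term is bounded by a
# multiple of n^2 once n >= 1, so f(n) <= (2 + 120 + 5) * n^2 there.
holds = all(f(n) <= 127 * n * n for n in range(1, 5000))
```

Going through this once shows *why* the lower-order terms can be dropped: they can always be absorbed into the constant c, which is exactly what the shortcut relies on.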

Complexity of inserting n numbers into a binary search tree

风格不统一 submitted on 2019-12-18 17:29:47
Question: I have a question that says "calculate the tight time complexity for the process of inserting n numbers into a binary search tree". It does not state whether the tree is balanced or not. So, what answer can be given to such a question? If the tree is balanced, then its height is log n, and inserting n numbers takes O(n log n) time. But if it is unbalanced, it may take even O(n^2) time in the worst case. What does it mean to find the tight time complexity of inserting n numbers into a BST?
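The O(n^2) worst case is easy to exhibit: inserting already-sorted keys into an unbalanced BST degenerates it into a path. A minimal Python sketch (plain recursive insert, no balancing):

```python
class Node:
    def __init__(self, key):
        self.key = key
        self.left = self.right = None

def insert(root, key):
    # Standard unbalanced BST insert; the cost is the depth of the path
    # walked, so on a degenerate tree insertion i costs about i steps.
    if root is None:
        return Node(key)
    if key < root.key:
        root.left = insert(root.left, key)
    else:
        root.right = insert(root.right, key)
    return root

def height(node):
    if node is None:
        return 0
    return 1 + max(height(node.left), height(node.right))

root = None
for k in range(100):        # sorted input: the tree becomes a right spine
    root = insert(root, k)
h = height(root)
```

Each insertion walks the full spine, so the total is 0 + 1 + ... + (n-1) = Θ(n^2); with a self-balancing tree the same n insertions are Θ(n log n). That gap is why the problem is underdetermined without saying which kind of tree is meant.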

Asymptotic Complexity of Logarithms and Powers

前提是你 submitted on 2019-12-18 05:27:16
Question: Clearly, log(n) is O(n). But what about (log(n))^2? What about sqrt(n) versus log(n)? Which bounds which? There's a whole family of comparisons like this: n^a versus (log(n))^b. I run into these comparisons a lot, and I've never come up with a good way to solve them. Any hints on tactics for the general case? Thanks, Ian. EDIT: I'm not talking about the computational complexity of calculating the values of these functions; I'm talking about the functions themselves. E.g., f(n) = n is an upper bound
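The general tactic: for any a > 0 and any b, (log n)^b = o(n^a). Substituting n = e^m turns the comparison into m^b versus e^(am), where the exponential always wins. A numeric Python sketch (a = 0.5, b = 2 chosen as one example, i.e. sqrt(n) against (log n)^2):

```python
import math

def log_poly_ratio(n, a=0.5, b=2):
    # (log n)^b / n^a tends to 0 for any a > 0, so n^a eventually
    # dominates any fixed power of log n (here sqrt(n) beats (log n)^2).
    return math.log(n) ** b / n ** a

samples = [log_poly_ratio(10 ** k) for k in (2, 6, 12)]
```

Note the crossover can be far out: at n = 100 the ratio is still above 1, and only for larger n does sqrt(n) pull ahead, which is why small-n plots of these comparisons are misleading.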