asymptotic-complexity

Big O of clojure library functions

a 夏天 submitted on 2019-12-04 05:37:15
Can anyone point me to a resource that lists the Big-O complexity of basic Clojure library functions such as conj, cons, etc.? I know that Big-O varies depending on the type of the input, but still, is such a resource available? I feel uncomfortable coding something without having a rough idea of how quickly it will run. Here is a table composed by John Jacobsen and taken from this discussion. Late to the party here, but I found the link in the comments of the first answer to be more definitive, so I'm reposting it here with a few modifications (that is, English descriptions converted to Big-O notation).

Constants in the formal definition of Big O

好久不见. submitted on 2019-12-04 04:04:17
Question: I'm revising the formal definitions of Big O and the other associated bounds, and something is tripping me up. In the book I'm reading (Skiena), Big O is defined as: f(n) = O(g(n)) when there exists a constant c such that f(n) is always ≤ c·g(n) for all n ≥ n0. This generally makes sense to me: we are only concerned with values of n large enough that the growth rates actually matter. But why multiply g(n) by c? It seems like I could choose a very large value for c and make the whole…
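A minimal Python sketch (not from the original question) of why the constant c is needed: take f(n) = 2n and g(n) = n. Both are linear, so f grows no faster than g, yet f(n) ≤ g(n) is false for every n ≥ 1. Allowing a constant factor fixes this without changing the growth rate being compared.

```python
# f and g have the same growth rate, but f(n) <= g(n) never holds.
f = lambda n: 2 * n
g = lambda n: n

# Without c, the bound fails for every n >= 1:
assert not any(f(n) <= g(n) for n in range(1, 100))

# With c = 2, the bound holds for all n >= 1, so f(n) = O(g(n)):
assert all(f(n) <= 2 * g(n) for n in range(1, 100))
```

As for choosing "a very large c": c is fixed once, before n grows, so no constant c can make, say, n = O(1) — n eventually exceeds c·1 no matter how large c is.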

Confused on big O notation

只愿长相守 submitted on 2019-12-04 01:54:30
Question: According to this book, Big O means: f(n) = O(g(n)) means c·g(n) is an upper bound on f(n). Thus there exists some constant c such that f(n) is always ≤ c·g(n), for large enough n (i.e., n ≥ n0 for some constant n0). I have trouble understanding the claim 3n^2 − 100n + 6 = O(n^2), where we choose c = 3 because 3n^2 > 3n^2 − 100n + 6. How can 3 be a factor? In 3n^2 − 100n + 6, if we drop the low-order terms −100n and 6, aren't 3n^2 and 3·n^2 the same? How is this inequality justified?
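A numerical sketch (mine, not from the book) of why c = 3 works here: the low-order terms −100n + 6 are negative for every n ≥ 1, so dropping them only increases the value, and f(n) ≤ 3·n^2 holds from n0 = 1 on.

```python
# f(n) = 3n^2 - 100n + 6 and the candidate bound c * g(n) with c = 3.
def f(n):
    return 3 * n**2 - 100 * n + 6

def g(n):
    return n**2

c, n0 = 3, 1
# -100n + 6 <= 0 for n >= 1, so f(n) <= 3n^2 on the whole range:
assert all(f(n) <= c * g(n) for n in range(n0, 10_000))
```

So 3n^2 and 3·n^2 are indeed the same quantity; the point is that c·g(n) only has to dominate f(n) for n ≥ n0, and here the discarded terms pull f(n) below that ceiling rather than above it.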

Asymptotic complexity for typical expressions

房东的猫 submitted on 2019-12-03 21:00:51
The increasing order of the following functions, shown in the picture below, in terms of asymptotic complexity is: (A) f1(n); f4(n); f2(n); f3(n) (B) f1(n); f2(n); f3(n); f4(n) (C) f2(n); f1(n); f4(n); f3(n) (D) f1(n); f2(n); f4(n); f3(n). a) The ordering for this supposedly easy question was given as (n^0.99)·(log n) < n — how? log may be a slow-growing function, but it still grows faster than a constant. b) Consider f1: suppose it is f1(n) = (n^1.0001)·(log n); then what would be the answer? Whenever there is an expression that involves multiplication between a logarithmic and a polynomial…
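A Python sketch (my own, under the assumption that "log" means natural log — the base does not affect the ordering) showing that n^0.99 · log n = o(n), even though the crossover happens only for enormous n. Comparing logarithms of the two functions avoids floating-point overflow: ln(n^0.99 · ln n) = 0.99·ln n + ln(ln n), versus ln(n) = ln n.

```python
import math

def log_of_f(ln_n):            # ln( n^0.99 * ln(n) ), as a function of ln(n)
    return 0.99 * ln_n + math.log(ln_n)

def log_of_g(ln_n):            # ln( n )
    return ln_n

# At n = 10**1000, the "slower" function really is smaller:
ln_n_huge = 1000 * math.log(10)
assert log_of_f(ln_n_huge) < log_of_g(ln_n_huge)

# At modest n (here n = 10**6) the inequality is still reversed,
# which is why the ranking feels counter-intuitive:
ln_n_small = math.log(10**6)
assert log_of_f(ln_n_small) > log_of_g(ln_n_small)
```

The intuition: n / (n^0.99 · log n) = n^0.01 / log n, and any positive power of n eventually outgrows any power of log n — it just takes until roughly n ≈ e^650 here.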

Why is the Big-O complexity of this algorithm O(n^2)?

社会主义新天地 submitted on 2019-12-03 08:04:28
Question: I know the Big-O complexity of this algorithm is O(n^2), but I cannot understand why. int sum = 0; int i = 1; int j = n * n; while (i++ < j--) sum++; Even though we set j = n * n at the beginning, we increment i and decrement j during each iteration, so shouldn't the resulting number of iterations be a lot less than n*n? Answer 1: During every iteration you increment i and decrement j, which is equivalent to just incrementing i by 2. Therefore, the total number of iterations is n^2 / 2, and that is still O(n^2).
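The loop can be simulated directly (a Python sketch, mine) to confirm the answer's count: it runs exactly ⌊n²/2⌋ times, and n²/2 is still Θ(n²) because Big-O discards constant factors.

```python
# Simulate the C loop: while (i++ < j--) compares first, then moves
# both i and j, shrinking the gap j - i by 2 per iteration.
def iterations(n):
    count = 0
    i, j = 1, n * n
    while i < j:
        count += 1
        i += 1
        j -= 1
    return count

for n in (3, 10, 50):
    assert iterations(n) == (n * n) // 2   # exactly floor(n^2 / 2)
```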

Big O is for worst-case running time and Ω for best-case, so why is Ω sometimes used for the worst case?

假装没事ソ submitted on 2019-12-03 06:23:22
Question: I'm confused. I thought that you use Big O for the worst-case running time and Ω for the best case? Can someone please explain? And isn't (lg n) the best case and (n lg n) the worst case? Or am I misunderstanding something? The exercise: Show that the worst-case running time of Max-Heapify on a heap of size n is Ω(lg n). (Hint: For a heap with n nodes, give node values that cause Max-Heapify to be called recursively at every node on a path from the root down to a leaf.) Edit: no, this is not homework. I'm…
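A Python sketch (my own construction, following the hint) of the adversarial input: build a max-heap whose root holds the smallest value, so max-heapify must swap it down an entire root-to-leaf path of ⌊log₂ n⌋ edges. That one input forces Ω(lg n) work in the worst case, even though the best case (root already largest) is O(1).

```python
import math

def max_heapify(a, i, swaps=0):
    """Sift a[i] down; return how many swaps were needed."""
    largest, l, r = i, 2 * i + 1, 2 * i + 2
    if l < len(a) and a[l] > a[largest]:
        largest = l
    if r < len(a) and a[r] > a[largest]:
        largest = r
    if largest != i:
        a[i], a[largest] = a[largest], a[i]
        return max_heapify(a, largest, swaps + 1)
    return swaps

n = 1023
heap = [n - i for i in range(n)]   # a[i] = n - i: every parent > children
heap[0] = 0                        # adversarial root: smallest value of all
assert max_heapify(heap, 0) == math.floor(math.log2(n))   # full path: 9 swaps
```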

Asymptotic tight bound for quadratic functions

跟風遠走 submitted on 2019-12-03 06:01:25
Question: In CLRS (Introduction to Algorithms by Cormen, Leiserson, Rivest, and Stein), for a function f(n) = an^2 + bn + c they say: Suppose we take the constants c1 = a/4, c2 = 7a/4, and n0 = 2·max(|b|/a, √(|c|/a)). Then 0 ≤ c1·n^2 ≤ an^2 + bn + c ≤ c2·n^2 for all n ≥ n0. Therefore f(n) is Θ(n^2). But they didn't explain where the values of these constants came from. I tried to prove the inequality but couldn't. How were these constants derived? Answer 1: There's nothing special about those…
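A numerical sketch (mine) verifying CLRS's choices for sample coefficients. The idea behind the constants: once n ≥ 2|b|/a we have |b|n ≤ (a/2)n², and once n ≥ 2√(|c|/a) we have |c| ≤ (a/4)n², so the quadratic term dominates and f(n) is squeezed between (a/4)n² and (7a/4)n².

```python
import math

def check(a, b, c, n_max=5_000):
    """Verify 0 <= c1*n^2 <= a*n^2 + b*n + c <= c2*n^2 for n0 <= n < n_max."""
    f = lambda n: a * n**2 + b * n + c
    c1, c2 = a / 4, 7 * a / 4
    n0 = 2 * max(abs(b) / a, math.sqrt(abs(c) / a))
    for n in range(math.ceil(n0), n_max):
        assert 0 <= c1 * n**2 <= f(n) <= c2 * n**2
    return True

assert check(a=3, b=-100, c=6)   # works with a negative linear term
assert check(a=1, b=50, c=-7)    # and with a negative constant term
```

The 1/4 and 7/4 are just a/4 below and |b|-plus-|c| head-room above: a − a/2 − a/4 = a/4 on the low side, a + a/2 + a/4 = 7a/4 on the high side.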

Asymptotically optimal algorithm to compute if a line intersects a convex polygon

筅森魡賤 submitted on 2019-12-03 05:51:07
Question: An O(n) algorithm to detect whether a line intersects a convex polygon consists of checking whether any edge of the polygon intersects the line, and looking at whether the number of intersections is odd or even. Is there an asymptotically faster algorithm, e.g. an O(log n) one? Answer 1: lhf's answer is close to correct. Here is a version that should fix the problem with it. Let the polygon have vertices v0, v1, ..., vn in counterclockwise order. Let the points x0 and x1 be on the line. Note two things: First,…
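For reference, the O(n) baseline can be sketched with signed areas instead of edge-by-edge intersection tests (my own formulation, assuming an infinite line): the line meets a convex polygon iff the vertices do not all lie strictly on one side of it. The O(log n) improvement then binary-searches for the two extreme vertices relative to the line's direction instead of scanning them all.

```python
def side(x0, y0, x1, y1, px, py):
    """Sign of the cross product (x1-x0, y1-y0) x (px-x0, py-y0):
    +1 / -1 for the two half-planes, 0 if p is on the line."""
    d = (x1 - x0) * (py - y0) - (y1 - y0) * (px - x0)
    return (d > 0) - (d < 0)

def line_hits_convex_polygon(x0, y0, x1, y1, poly):
    signs = {side(x0, y0, x1, y1, px, py) for (px, py) in poly}
    # Miss only if every vertex is strictly on the same side.
    return not (signs == {1} or signs == {-1})

square = [(0, 0), (2, 0), (2, 2), (0, 2)]
assert line_hits_convex_polygon(-1, 1, 3, 1, square)       # cuts through it
assert not line_hits_convex_polygon(-1, 5, 3, 5, square)   # passes above it
```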

Big O Notation of an expression

依然范特西╮ submitted on 2019-12-03 04:33:18
If I have an algorithm that takes 4n^2 + 7n moves to accomplish its task, what is its O? O(4n^2)? O(n^2)? I know that the 7n is dropped, but I don't know whether to keep the coefficient on n^2. Thanks. You should drop any coefficients, because the question is really asking "on the order of", which tries to characterize the growth as linear, exponential, logarithmic, etc. That is, when n is very large, the coefficient is of little importance. This also explains why you drop the +7n: when n is very large, that term has relatively little significance to the final answer. If you are familiar with calculus…
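A quick numerical sketch (mine) of both simplifications: the ratio f(n)/n² settles toward the constant 4, so n² alone captures the growth rate, and both the coefficient 4 and the +7n term get absorbed into Big-O's constant c.

```python
def f(n):
    return 4 * n**2 + 7 * n

# f(n) / n^2 = 4 + 7/n, which shrinks toward 4 as n grows:
for n in (10, 1_000, 1_000_000):
    assert 4 < f(n) / n**2 <= 4.7

# c = 5 witnesses f(n) = O(n^2): f(n) <= 5n^2 exactly when 7n <= n^2,
# i.e. for all n >= 7.
assert all(f(n) <= 5 * n**2 for n in range(7, 10_000))
```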

How can I find a number which occurs an odd number of times in a SORTED array in O(n) time?

混江龙づ霸主 submitted on 2019-12-03 01:38:16
Question: I have a question that I have tried to think over again and again, but got nothing, so I'm posting it here. Maybe I can get some other viewpoints to make it work. The question is: we are given a SORTED array consisting of values that each occur an EVEN number of times, except one, which occurs an ODD number of times. We need to find the solution in O(log n) time. It is easy to find the solution in O(n) time, but it looks pretty tricky to do in O(log n) time.
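A Python sketch of the standard O(log n) idea (not taken from this thread, so treat it as one possible approach): before the odd-count value, every pair starts at an even index (a[2k] == a[2k+1]); at and after it, that pairing is shifted. Binary-search for the first even index where the pairing breaks.

```python
def odd_occurrence(a):
    """Value occurring an odd number of times in a sorted array where
    every other value occurs an even number of times. O(log n)."""
    lo, hi = 0, len(a) - 1
    while lo < hi:
        mid = (lo + hi) // 2
        mid -= mid % 2              # align mid to an even index
        if a[mid] == a[mid + 1]:
            lo = mid + 2            # pairing intact: answer lies to the right
        else:
            hi = mid                # pairing broken: answer is here or left
    return a[lo]

assert odd_occurrence([1, 1, 2, 2, 3, 3, 3, 4, 4]) == 3
assert odd_occurrence([5, 5, 7, 8, 8]) == 7
assert odd_occurrence([9]) == 9
```

Each step halves the search range while keeping the invariant that the odd-count value lies in [lo, hi], which is exactly the log n behavior the question asks for.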