asymptotic-complexity

Why is the Big-O complexity of this algorithm O(n^2)?

断了今生、忘了曾经 submitted on 2019-12-02 23:15:12
I know the big-O complexity of this algorithm is O(n^2), but I cannot understand why.

    int sum = 0;
    int i = 1;
    int j = n * n;
    while (i++ < j--)
        sum++;

Even though we set j = n * n at the beginning, we increment i and decrement j during each iteration, so shouldn't the resulting number of iterations be a lot less than n * n?

During every iteration you increment i and decrement j, which is equivalent to just incrementing i by 2. Therefore, the total number of iterations is n^2 / 2, and that is still O(n^2).

Ben Rubin: big-O complexity ignores coefficients. For example, O(n), O(2n), and O(1000n) are all O(n).
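The claim that the loop runs about n^2 / 2 times can be checked directly. A minimal sketch (function name hypothetical) that runs the loop and returns the iteration count:

```c
/* Runs the loop from the question and counts its iterations.
 * The gap j - i shrinks by 2 per iteration, so the loop body
 * executes roughly n*n / 2 times. */
long count_iterations(long n) {
    long sum = 0;
    long i = 1, j = n * n;
    while (i++ < j--)
        sum++;          /* one iteration of the original loop */
    return sum;
}
```

For n = 10 the loop runs 50 times, exactly n^2 / 2, which is still Θ(n^2).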

Asymptotically optimal algorithm to compute if a line intersects a convex polygon

人走茶凉 submitted on 2019-12-02 19:09:22
An O(n) algorithm to detect whether a line intersects a convex polygon checks each edge of the polygon for an intersection with the line and then looks at whether the number of intersections is odd or even. Is there an asymptotically faster algorithm, e.g. an O(log n) one?

lhf's answer is close to correct. Here is a version that should fix the problem with it. Let the polygon have vertices v0, v1, ..., vn in counterclockwise order, and let the points x0 and x1 be on the line. Note two things: first, finding the intersection of two lines (and determining whether it exists) can be done in constant time. Second,
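The constant-time primitive the answer relies on can be sketched with a standard cross-product orientation test; the O(n) straddle check below is the baseline that an O(log n) algorithm refines by searching for the extreme vertices instead of scanning them all (function names hypothetical, not the answerer's code):

```c
/* Sign of the cross product (b - a) x (c - a): +1 if c is to the left of
 * the directed line a->b, -1 if to the right, 0 if collinear.
 * This is a constant-time predicate. */
int orientation(double ax, double ay, double bx, double by,
                double cx, double cy) {
    double cross = (bx - ax) * (cy - ay) - (by - ay) * (cx - ax);
    return (cross > 0) - (cross < 0);
}

/* An infinite line through (x0,y0) and (x1,y1) meets a convex polygon
 * iff the vertices do not all lie strictly on one side of the line.
 * This scan is O(n); the O(log n) idea replaces it with a search for
 * the two extreme vertices relative to the line. */
int line_meets_polygon(const double *vx, const double *vy, int n,
                       double x0, double y0, double x1, double y1) {
    int pos = 0, neg = 0;
    for (int i = 0; i < n; i++) {
        int s = orientation(x0, y0, x1, y1, vx[i], vy[i]);
        if (s > 0) pos = 1;
        if (s < 0) neg = 1;
        if (s == 0 || (pos && neg))
            return 1;   /* vertices touch or straddle the line */
    }
    return 0;
}
```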

How can I find a number which occurs an odd number of times in a SORTED array in O(n) time?

给你一囗甜甜゛ submitted on 2019-12-02 15:07:52
I have a question that I have tried to think over again and again, but got nothing, so I am posting it here. Maybe I can get some other viewpoints to make it work.

The question is: we are given a SORTED array consisting of values that each occur an EVEN number of times, except one value, which occurs an ODD number of times. We need to find that value in O(log n) time. It is easy to find the solution in O(n) time, but it looks pretty tricky to do in O(log n) time.

Theorem: every deterministic algorithm for this problem probes Ω(log₂ n) memory locations in the worst case.
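One way the O(log n) bound can be met is the classic pair-alignment binary search. A hedged sketch (function name hypothetical; assumes an integer array in which exactly one value has odd count):

```c
/* Left of the odd-count value, pairs start at even indices
 * (a[2i] == a[2i+1]); at and after it, that alignment breaks.
 * Binary-search for the first misaligned pair: O(log n) probes. */
int find_odd_occurrence(const int *a, int n) {
    int lo = 0, hi = n - 1;
    while (lo < hi) {
        int mid = lo + (hi - lo) / 2;
        if (mid % 2 == 1)
            mid--;              /* snap mid to the start of a pair */
        if (a[mid] == a[mid + 1])
            lo = mid + 2;       /* alignment intact: answer is to the right */
        else
            hi = mid;           /* alignment broken: answer is at or left of mid */
    }
    return a[lo];
}
```

Sortedness matters only in that it keeps equal values adjacent, which is what makes the pair alignment well defined.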

Collatz conjecture: loose upper/lower bounds? [closed]

核能气质少年 submitted on 2019-12-02 12:24:03
This is a problem from my textbook. The Collatz conjecture (or the "3n + 1" problem) works as follows, given some natural number n:

    while n > 1 do
        if n is even then
            n = n / 2
        else
            n = 3n + 1
        end if
    end while

I've skimmed a few papers on the conjecture, but they all went over my head. I'm trying to get a basic understanding of the algorithm's complexity. Is it possible to comment on an upper or lower bound for the number of operations performed in the worst case? The only thing I've been able to deduce is that a best-case input must be of the form n = 2^k (which will result in the fewest operations).
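The pseudocode above transcribes directly into code; a sketch (function name hypothetical) that also counts iterations, which is handy for experimenting with bounds:

```c
/* Direct transcription of the pseudocode that counts loop iterations.
 * For n = 2^k the loop halves n every time, so it runs exactly
 * k = log2(n) times -- the best case noted in the question. No upper
 * bound is known in general; that is the open conjecture. */
long collatz_steps(long n) {
    long steps = 0;
    while (n > 1) {
        if (n % 2 == 0)
            n = n / 2;
        else
            n = 3 * n + 1;
        steps++;
    }
    return steps;
}
```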

Can not figure out complexity of this recurrence

懵懂的女人 submitted on 2019-12-01 22:20:37
I am refreshing on the Master Theorem a bit, and I am trying to figure out the running time of an algorithm that solves a problem of size n by recursively solving 2 subproblems of size n - 1 and combining the solutions in constant time. So the recurrence is:

    T(n) = 2T(n - 1) + O(1)

But I am not sure how to formulate the conditions of the Master Theorem. I mean, we don't have T(n/b), so is the b of the Master Theorem formula in this case b = n/(n-1)? If yes, then since obviously a > b^k (since k = 0), the result is O(n^z), where z = log 2 to the base n/(n-1). How can I make sense of this, assuming I am right so far? Ah, enough with the
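One way to see why the Master Theorem is the wrong tool here: the subproblem size shrinks by subtraction, not division, so there is no constant b. Unrolling the recurrence instead gives T(n) = 2T(n-1) + c = 4T(n-2) + 3c = ... = 2^(n-1) T(1) + (2^(n-1) - 1)c, and with T(1) = c that is T(n) = (2^n - 1)c, i.e. O(2^n). A small sketch (function names hypothetical) that checks the closed form against the recurrence:

```c
/* Iteratively evaluates T(n) = 2 T(n-1) + c with T(1) = c. */
long t_recurrence(int n, long c) {
    long t = c;                 /* T(1) = c */
    for (int i = 2; i <= n; i++)
        t = 2 * t + c;          /* T(i) = 2 T(i-1) + c */
    return t;
}

/* Closed form obtained by unrolling: T(n) = (2^n - 1) c. */
long t_closed_form(int n, long c) {
    return ((1L << n) - 1) * c;
}
```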

When is an algorithm O(n + m) time?

限于喜欢 submitted on 2019-12-01 14:42:19
I was solving this problem on HackerRank. My algorithm to solve the problem is:

1. Get an array of all the player scores. Let there be n players in all.
2. Iterate through the player scores and create a new array that doesn't contain any repeated scores. Let us call the new array playerScores.
3. Let the total number of levels to be played by Alice be m.
4. Let Alice's score after the first round be S, and let Alice's initial rank R be 0.
5. Iterate the playerScores array from the rear end until you find a player whose score is less than S.
6. Set R to the rank of the player found in step 5.
7. Reduce m by 1.
8. Print R.
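Assuming this is the usual climbing-the-leaderboard setup the steps describe (descending leaderboard with ties, Alice's scores ascending), they can be sketched as follows; names and the fixed buffer size are hypothetical. Deduplication costs O(n) and the rear pointer only ever moves left across all m queries, which is where O(n + m) comes from:

```c
/* scores[]: leaderboard in descending order, possibly with ties.
 * alice[]:  Alice's scores in ascending order.
 * out[]:    Alice's rank after each round. */
void alice_ranks(const int *scores, int n, const int *alice, int m, int *out) {
    int unique[1024];                       /* assumes n <= 1024 for this sketch */
    int u = 0;
    for (int i = 0; i < n; i++)             /* drop repeated scores: O(n) */
        if (u == 0 || unique[u - 1] != scores[i])
            unique[u++] = scores[i];
    int idx = u - 1;                        /* rear of the deduplicated board */
    for (int k = 0; k < m; k++) {
        while (idx >= 0 && unique[idx] <= alice[k])
            idx--;                          /* walk past scores Alice beat */
        out[k] = idx + 2;                   /* rank is one past position idx */
    }
}
```

Because alice[] is ascending, idx never moves right again, so the inner while loop does at most n total work over all m rounds.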

Complexity of the recursion: T(n) = T(n-1) + T(n-2) + C

非 Y 不嫁゛ submitted on 2019-12-01 04:32:58
I want to understand how to arrive at the complexity of the recurrence

    T(n) = T(n-1) + T(n-2) + C,  given T(1) = C and T(2) = 2C.

Generally, for recurrences like T(n) = 2T(n/2) + C (given T(1) = C), I use the following method:

    T(n) = 2T(n/2) + C
         = 4T(n/4) + 3C
         = 8T(n/8) + 7C
         = ...
         = 2^k T(n/2^k) + (2^k - 1)C

When n/2^k = 1, we have k = log₂ n, so

    T(n) = n T(1) + (n - 1)C = (2n - 1)C = O(n).

But I'm not able to come up with a similar approach for the recurrence in question. Please correct me if my approach is incorrect.

The complexity is related
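One way to pin down the growth: adding C to both sides turns the recurrence into the Fibonacci recurrence, since T(n) + C = (T(n-1) + C) + (T(n-2) + C). So with S(n) = T(n) + C we get S(n) = S(n-1) + S(n-2), S(1) = 2C, S(2) = 3C, and T(n) grows like the Fibonacci numbers, i.e. Θ(φ^n) with φ = (1+√5)/2 ≈ 1.618. A sketch (function names hypothetical) checking the substitution numerically:

```c
/* T(n) = T(n-1) + T(n-2) + c, with T(1) = c and T(2) = 2c. */
long t_rec(int n, long c) {
    if (n == 1) return c;
    if (n == 2) return 2 * c;
    return t_rec(n - 1, c) + t_rec(n - 2, c) + c;
}

/* S(n) = T(n) + c: a pure Fibonacci recurrence, S(1) = 2c, S(2) = 3c. */
long s_rec(int n, long c) {
    if (n == 1) return 2 * c;
    if (n == 2) return 3 * c;
    return s_rec(n - 1, c) + s_rec(n - 2, c);
}
```

The identity t_rec(n, c) + c == s_rec(n, c) holds for all n, which is the whole argument in one line.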