I know this isn't strictly a programming question, but it is a computer science question, so I'm hoping someone can help me.
I've been working on my Algorithms homework.
http://en.wikipedia.org/wiki/Big_O_notation
N repetitions of an operation g(m)=O(f(m)) take O(N*f(m)) in total, for any f.
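To see that first claim concretely, here is a minimal Python sketch (the names step and repeat are mine, and I've made step do exactly m units of work, so f(m) = m):

    def step(m):
        # stand-in for an operation whose running time is O(f(m));
        # here it does m units of work, i.e. f(m) = m
        for _ in range(m):
            pass

    def repeat(N, m):
        # N repetitions of the O(f(m)) operation: O(N*f(m)) in total
        for _ in range(N):
            step(m)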
Sum of i=1..N of i*g(i) is O(N^2*f(N)) if g(n)=O(f(n)) and f is monotonic.
Definition: g(n)=O(f(n)) if there exist constants c and m such that g(n) <= c*f(n) for all n > m.
The sum is for i=1..N of i*g(i), and for i > m each term i*g(i) is at most i*c*f(i).
If f is monotonic in i, this means every term is <= i*c*f(N) <= N*c*f(N). Summing N such terms, the sum is less than N*(N*c*f(N)) = c*N^2*f(N),
so the sum is O(N^2*f(N))
(witnessed by the same c and m that make g(n)=O(f(n)); the finitely many terms with i <= m only change the constant).
Of course, log_2(x) and x^2 are both monotonically increasing (for x > 0).
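As a quick numerical sanity check (not a proof; the helper check and the choice f = g are just for illustration), this Python script prints the ratio of the sum to the claimed N^2*f(N) bound. If the bound is right, the ratio should stay bounded as N grows; in fact it settles near 1/2 for log_2 and near 1/4 for x^2:

    import math

    def check(g, f, Ns):
        # ratio of sum_{i=1..N} of i*g(i) to N^2*f(N); should stay bounded
        for N in Ns:
            total = sum(i * g(i) for i in range(1, N + 1))
            bound = N * N * f(N)
            print(N, total / bound)

    # g(n) = log_2(n), with f = g (so c = 1 in the definition)
    check(math.log2, math.log2, [10, 100, 1000, 10000])

    # g(n) = n^2, again with f = g
    check(lambda n: n * n, lambda n: n * n, [10, 100, 1000, 10000])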