I am learning about Big O Notation running times and amortized times. I understand the notion of O(n) linear time, meaning that the size of the input affects the growth of the algorithm's running time proportionally.
If you had a function that takes:
1 millisecond to complete if you have 2 elements.
2 milliseconds to complete if you have 4 elements.
3 milliseconds to complete if you have 8 elements.
4 milliseconds to complete if you have 16 elements.
...
n milliseconds to complete if you have 2^n elements.
Then it takes log2(n) time, where n here is the number of elements. The Big O notation, loosely speaking, means that the relationship only needs to be true for large n, and that constant factors and smaller terms can be ignored.
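One quick way to see that pattern is to count how many times you can halve n before hitting 1; doubling the input adds only one step. Here's a minimal Python sketch (the function name is mine, just for illustration):

    import math

    def halving_steps(n):
        # Count how many times n can be halved before reaching 1.
        steps = 0
        while n > 1:
            n //= 2
            steps += 1
        return steps

    for n in [2, 4, 8, 16, 1024, 1_000_000]:
        # steps grows like log2(n): doubling n adds only one more step
        print(n, halving_steps(n), round(math.log2(n), 1))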
The best way I've always had to mentally visualize an algorithm that runs in O(log n) is as follows:
If you increase the problem size by a multiplicative amount (e.g. multiply its size by 10), the work is only increased by an additive amount.
Applying this to your binary tree question, so you have a concrete application: if you double the number of nodes in a binary tree, the height only increases by 1 (an additive amount). If you double it again, it still only increases by 1. (Obviously I'm assuming it stays balanced.) That way, instead of doubling your work when the problem size is multiplied, you're only doing very slightly more work. That's why O(log n) algorithms are awesome.
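To put rough numbers on that, here's a small sketch assuming a perfect (completely filled) binary tree, where the height of a tree with n nodes is floor(log2(n)):

    import math

    # For a perfect binary tree, n nodes means a height of floor(log2(n)).
    # Notice that roughly doubling the node count only adds 1 to the height.
    for n in [15, 31, 63, 127, 255]:
        print(f"nodes = {n:3d}   height = {int(math.log2(n))}")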
It simply means that the time needed for this task grows with log(n) (for example: 2s for n = 10, 4s for n = 100, ...). Read the Wikipedia articles on Binary Search Algorithm and Big O Notation for more details.
When you see O(log n) here, the logarithm in question is usually log2 n, i.e. the logarithm with base 2; since logarithms in different bases differ only by a constant factor, Big O notation lets you drop the base and just write O(log n).
The height of a balanced binary tree is O(log2 n), since every node has two (note the "two" as in log2 n) child nodes. So, a tree with n nodes has a height of roughly log2 n.
Another example is binary search, which has a running time of O(log2 n) because at every step you divide the search space by 2.
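For instance, a standard iterative binary search might look like this (a plain Python sketch, not tied to any particular library):

    def binary_search(sorted_list, target):
        # Return the index of target in sorted_list, or -1 if absent.
        # Each iteration halves the remaining search space, so the loop
        # runs at most about log2(n) times for a list of n elements.
        lo, hi = 0, len(sorted_list) - 1
        while lo <= hi:
            mid = (lo + hi) // 2
            if sorted_list[mid] == target:
                return mid
            elif sorted_list[mid] < target:
                lo = mid + 1
            else:
                hi = mid - 1
        return -1

    print(binary_search(list(range(0, 100, 2)), 42))   # prints 21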
log x to base b = y   is the inverse of   b^y = x
If you have an M-ary tree of depth d and size n, then:
Traversing the whole tree ~ O(M^d) = O(n)
Walking a single path in the tree ~ O(d) = O(log n to base M)
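A rough numeric check of those two statements, assuming a perfect M-ary tree (M and d below are just example values):

    import math

    # Example values only: M = branching factor, d = depth of a perfect M-ary tree.
    M, d = 3, 5
    n = M ** d                 # number of leaves, roughly the size of the tree
    print(n)                   # visiting everything touches on the order of M^d = n nodes
    print(d)                   # walking one root-to-leaf path touches only d of them
    print(math.log(n, M))      # log of n to base M recovers d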
If you plot a logarithmic function on a graphical calculator or something similar, you'll see that it rises really slowly -- even more slowly than a linear function.
This is why algorithms with a logarithmic time complexity are highly sought after: even for really big n (say, n = 10^8), they perform more than acceptably.
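If you want concrete numbers rather than a plot, this quick check (plain Python, nothing beyond the standard library) shows how small log2(n) stays even for huge n:

    import math

    for n in [10**3, 10**6, 10**8, 10**12]:
        # even at n = 10^12, the logarithm is still only about 40
        print(f"n = {n:>17,}   log2(n) = {math.log2(n):5.1f}")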