I often hear people talk about Big O, which is used to measure algorithms against each other.
Does this measure clock cycles or space requirements?
If people want to contrast two algorithms, what exactly is being compared?
Big O and its relatives (Ω, Θ) are used to measure the growth rate of something.
When someone says that something is O(N), that thing grows no faster than a linear rate. If something is Ω(N^2), it grows no slower than a quadratic rate. And when something is Θ(2^N), it grows at exactly an exponential rate (bounded above and below).
What that thing is can be the time requirement for an algorithm. It can also be the space (i.e. memory) requirement for an algorithm. It can also be pretty much anything else, related to neither space nor time.
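To make the time-versus-space distinction concrete, here is a small sketch of my own (the function names are just illustrative). Both functions do O(N) work, but one uses O(1) extra space while the other uses O(N):

```python
def total(xs):
    # O(N) time, O(1) extra space: a single running accumulator,
    # no matter how long the input is.
    s = 0
    for x in xs:
        s += x
    return s

def prefix_sums(xs):
    # O(N) time, O(N) extra space: the output list grows
    # in proportion to the input.
    out = []
    s = 0
    for x in xs:
        s += x
        out.append(s)
    return out
```

Big O applies to each measure independently: the same algorithm can have different growth rates for time and for space.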
For example, massively parallel algorithms are often measured by how they scale with the number of processors they can run on. One algorithm may run on O(N) processors in O(N^2) time, while another runs on O(N^2) processors in O(N) time.
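As a back-of-the-envelope check (my own sketch, with hypothetical algorithms A and B matching the two profiles above), the total work, processors times time, comes out the same for both, even though the tradeoff between the two resources is very different:

```python
def work_a(n):
    # Hypothetical algorithm A: O(N) processors * O(N^2) time.
    processors = n
    time = n ** 2
    return processors * time

def work_b(n):
    # Hypothetical algorithm B: O(N^2) processors * O(N) time.
    processors = n ** 2
    time = n
    return processors * time
```

In both cases the product is N^3, so the algorithms differ in *which* resource they consume fastest, not in total work.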