I'm just not sure...
If you have code that can be executed in either of the following complexities:
I think there are two issues here: first, what the notation says; second, what you would actually measure on real programs.
Big O is defined in terms of behaviour as n -> infinity, so in big-O terms O(n) < O(n^2) always holds, regardless of any finite constants.
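For reference, one standard way to state the definition (my paraphrase, not wording from the question or other answers) is:

```latex
% f is O(g) iff some constant multiple of g eventually dominates f:
f(n) = O(g(n)) \iff \exists\, C > 0,\ n_0 \ \text{s.t.}\ f(n) \le C \cdot g(n) \ \text{for all } n \ge n_0
% With f(n) = c\,n and g(n) = n^2: choose C = 1 and n_0 = c, since c\,n \le n^2 once n \ge c,
% so c\,n is O(n^2) for every constant c. The reverse fails: n^2 \le C\,n would force
% n \le C for all large n, which is impossible, so n^2 is not O(n).
```

The constant c is swallowed by the choice of n_0, which is why it never shows up in the asymptotic ordering.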
As others have pointed out, real programs only ever deal with some finite input, so it is quite possible to pick a small enough value of n such that c*n > n^2, i.e. c > n; however, strictly speaking, you are then no longer dealing with big O.
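To make the crossover concrete, here is a small sketch (mine, not from the question; the constant C = 1000 and the step counts are purely illustrative) that counts abstract "steps" for a linear algorithm with a large hidden constant versus a quadratic one:

```python
C = 1000  # hypothetical large constant hidden inside the "linear" algorithm

def linear_steps(n: int) -> int:
    # Work proportional to C * n, e.g. n passes each doing C units of work.
    return C * n

def quadratic_steps(n: int) -> int:
    # Work proportional to n * n.
    return n * n

for n in (10, 100, 1_000, 10_000, 100_000):
    lin, quad = linear_steps(n), quadratic_steps(n)
    cheaper = "linear" if lin < quad else "quadratic"
    print(f"n={n:>7}: linear={lin:>12}  quadratic={quad:>12}  cheaper: {cheaper}")

# Below the crossover point n = C (here 1000) the quadratic algorithm does fewer
# steps, they tie at n = C, and beyond it the linear one wins, which is exactly
# what big O predicts as n -> infinity.
```

So both statements are consistent: on a fixed, small input the O(n^2) code can genuinely run faster, while the asymptotic comparison is about what happens once n grows past any fixed constant.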