So I've been trying to understand Big O notation as well as I can, but there are still some things I'm confused about. I keep reading that if something is O(n), it usually…
I find it easier to think of O() in terms of ratios rather than bounds. It is defined in terms of bounds, so that's a valid way to think of it, but I find it more useful to ask: "if I double the number/size of inputs to my algorithm, does my processing time double (O(n)), quadruple (O(n^2)), etc.?" Thinking about it that way makes it a little less abstract - at least to me.
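One way to see that doubling heuristic concretely is to count basic operations rather than measure wall-clock time. A minimal sketch (the function names and the choice of counting loop iterations are my own, just for illustration):

```python
def linear_ops(n):
    # One operation per element: O(n)
    count = 0
    for _ in range(n):
        count += 1
    return count

def quadratic_ops(n):
    # One operation per pair of elements (nested loops): O(n^2)
    count = 0
    for _ in range(n):
        for _ in range(n):
            count += 1
    return count

n = 500
print(linear_ops(2 * n) / linear_ops(n))        # 2.0 -> doubling the input doubles the work
print(quadratic_ops(2 * n) / quadratic_ops(n))  # 4.0 -> doubling the input quadruples it
```

Real timings are noisier than operation counts (constant factors, caching, etc.), but for large enough inputs the same ratios show up.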