I work as a programmer, but have no computer science background, so recently I've been following along with the excellent MIT OpenCourseWare intro to Computer Science and Programming.
"Constant time" has the same meaning as "O(1)", which doesn't mean that an algorithm runs in a fixed amount of time, it just means that it isn't proportional to the length/size/magnitude of the input. i.e., for any input, it can be computed in the same amount of time (even if that amount of time is really long).