I can give it floating point numbers, such as
time.sleep(0.5)
but how accurate is it? If I give it

time.sleep(0.05)

will it really sleep about 50 ms?
As a follow-up to Wilbert's answer, here are the same measurements for Mac OS X Yosemite, since it hasn't been mentioned much yet.
Much of the time it sleeps about 1.25 times the requested duration, and sometimes between 1 and 1.25 times. It almost never (about 2 out of 1000 samples) sleeps significantly more than 1.25 times the requested time.
Also (not shown explicitly), the 1.25 ratio seems to hold fairly well until you get below about 0.2 ms, after which it starts to get a little fuzzy. Additionally, once the requested time exceeds 20 ms, the actual time seems to settle at about 5 ms longer than requested.
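If you want to reproduce this on your own machine, a minimal sketch of the measurement loop might look like the following (I'm using time.perf_counter for timing, which needs Python 3.3+; Wilbert's exact setup may differ):

    import time

    def measure_sleep(requested, samples=1000):
        # Collect the ratio of actual to requested sleep time
        # over many samples.
        ratios = []
        for _ in range(samples):
            start = time.perf_counter()
            time.sleep(requested)
            elapsed = time.perf_counter() - start
            ratios.append(elapsed / requested)
        return ratios

    # e.g. request 1 ms sleeps and summarize the overshoot
    ratios = measure_sleep(0.001)
    print("min %.3fx, mean %.3fx, max %.3fx" %
          (min(ratios), sum(ratios) / len(ratios), max(ratios)))

Repeating this for a range of requested times and plotting the ratios is what produces the pattern described above.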
Again, it appears that OS X's implementation of sleep() is completely different from that of Windows or whichever Linux kernel Wilbert was using.