When writing a JavaScript animation, you of course make a loop using setInterval (or repeated setTimeout). But what is the best delay to use in the setInterval/setTimeout call?
The pseudo-code for this looks like this:

FPS_WANTED = 25
(just a number; it can be changed while executing, or it can be constant)

TIME_OF_DRAWING = 1000 / FPS_WANTED
(this is in milliseconds, and I believe it is accurate enough;
it should be updated whenever FPS_WANTED changes)

UntilTheUserLeavesTheDrawingApplication()
{
    time1 = getTime();
    doAnimation();
    time2 = getTime();
    animationTime = time2 - time1;

    if (animationTime > TIME_OF_DRAWING)
    {
        [FPS_WANTED cannot be reached]
        You can:
        1. Decrease FPS_WANTED to see if a lower framerate can be achieved
        2. Do nothing because you want to get all you can from the CPU
    }
    else
    {
        [FPS_WANTED can be reached - you can decide to]
        1. wait(TIME_OF_DRAWING - animationTime) to keep a constant framerate of FPS_WANTED
        2. Increase the framerate if you want
        3. Do nothing because you want to get all you can from the CPU
    }
}
Of course there can be variations of this, but this is the basic algorithm, and it applies to any kind of animation.
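Translated into actual JavaScript, the "wait" variant of the loop could look something like this. This is just a sketch using repeated setTimeout so the delay can be recomputed every frame; doAnimation is a placeholder for your drawing code:

function doAnimation() {
    // placeholder: move elements / redraw the canvas here
}

var FPS_WANTED = 25;                      // can change at runtime
var TIME_OF_DRAWING = 1000 / FPS_WANTED;  // frame budget in ms

function animationLoop() {
    var time1 = new Date().getTime();
    doAnimation();
    var time2 = new Date().getTime();
    var animationTime = time2 - time1;

    if (animationTime > TIME_OF_DRAWING) {
        // FPS_WANTED cannot be reached: run again as soon as possible
        setTimeout(animationLoop, 0);
    } else {
        // FPS_WANTED can be reached: sleep away the rest of the frame budget
        setTimeout(animationLoop, TIME_OF_DRAWING - animationTime);
    }
}

animationLoop();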
I would venture to say that a substantial fraction of web users are using monitors that refresh at 60Hz, which translates to one frame every 16.66ms. So to make the monitor the bottleneck, you need to produce frames faster than 16.66ms.
There are two reasons you would pick a value like 13ms. First, the browser needs a little bit of time to repaint the screen (in my experience, never less than 1ms), which puts you at updating every 15ms or so. Second, 15ms happens to be a very interesting number: the standard timer resolution on Windows is 15ms (see John Resig's blog post). I suspect that a well-written 15ms animation looks very close to the same on a wide variety of browsers/operating systems.
FWIW, fbogner is plain wrong about non-Chrome browsers firing setInterval every 20-30ms. I wrote a test to measure the speed of setInterval firing, and the measured intervals did not back that claim up.
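A test along these lines is easy to write yourself; this is just a sketch of the idea, with the 1ms request and the 100-sample count chosen arbitrarily:

var last = new Date().getTime();
var samples = [];

var id = setInterval(function () {
    var now = new Date().getTime();
    samples.push(now - last);  // actual gap between two firings
    last = now;

    if (samples.length === 100) {
        clearInterval(id);
        var sum = 0;
        for (var i = 0; i < samples.length; i++) {
            sum += samples[i];
        }
        console.log('average interval: ' + (sum / samples.length) + 'ms');
    }
}, 1);  // request a 1ms interval and see what the browser really delivers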
The numbers given by fbogner have been tested. Browsers throttle JS activity to a certain degree so that the page stays usable at all times.

If your JavaScript could run every 5ms, the browser would have much less CPU time to refresh the rendering or react to user input (clicks), because JavaScript execution blocks the browser.

I think the Chrome devs allow you to run your JavaScript at much shorter intervals than the other browsers because their V8 JavaScript engine compiles the JavaScript, so it runs faster and the browser is not blocked for as long as it is with interpreted JS code.

But the engine being faster is not the only reason for allowing shorter intervals: the devs have certainly tested what the shortest interval is that allows quick timers without blocking the browser for too long.
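You can see the blocking effect directly with a sketch like this (busyWait is a made-up helper that just burns CPU):

// Illustrative helper: burn the CPU for roughly `ms` milliseconds
function busyWait(ms) {
    var end = new Date().getTime() + ms;
    while (new Date().getTime() < end) {
        // spin
    }
}

// 4ms of scripted work for every 5ms requested leaves the browser almost
// no time to repaint or handle clicks; the page visibly stutters.
setInterval(function () {
    busyWait(4);
}, 5);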
I don't know the reasoning behind jQuery's interval time; 13ms translates to roughly 77fps, which is very fast. The "standard" frame rate used in movies and such is about 24-25fps, which is fast enough that the human eye won't notice any jittering. 25fps translates to 40ms, so to answer your question: anything below 40ms is enough for an animation.
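In code, the delay is just that division (draw is a placeholder for one animation step):

function draw() {
    // one animation step: update state, repaint
}

var fps = 25;                    // target frame rate
setInterval(draw, 1000 / fps);   // 1000 / 25 = 40ms between frames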
When doing loops for animations, it's best to find a balance between the speed of the loop and how much work needs to be done in each iteration.

For example, if you want to slide a div across the page within a second so the effect is nice and timely, you would skip coordinates and use a reasonably fast loop time, so the movement is noticeable but not jumpy.

So it's a trial-and-error thing: you have to take the work per frame, the timing, and browser capability into account, so that it doesn't only look nice in one browser.
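For instance, here is a sketch of such a one-second slide, driven by elapsed time so that a slow browser skips coordinates instead of stretching the effect (the element id, distance, and 20ms interval are made-up values, and the element is assumed to be absolutely positioned):

var el = document.getElementById('slider');  // hypothetical element id
var distance = 500;                          // pixels to travel (made up)
var duration = 1000;                         // slide within one second
var start = new Date().getTime();

var id = setInterval(function () {
    var elapsed = new Date().getTime() - start;
    var fraction = Math.min(elapsed / duration, 1);

    // Position depends on elapsed time, not on loop count, so a slow
    // browser skips coordinates (bigger jumps) instead of slowing down.
    el.style.left = (fraction * distance) + 'px';

    if (fraction === 1) {
        clearInterval(id);
    }
}, 20);  // aim for ~50fps; the real rate depends on the browser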