From my basic understanding, JavaScript audio visualizers reflect the music based on the actual sound waves. I would like to build something like a metronome (http://bl.
This post might be relevant:
The gist is that you run a function via setInterval() slightly faster than your tempo, for example every 100ms. Long story short, inside that callback you check the current time (e.g. Date.now(), or the (new Date()).getMilliseconds() approach from that post) and see whether the equivalent of one beat in milliseconds has passed, instead of relying on the not-so-accurate setTimeout or setInterval delays themselves.
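A minimal sketch of that idea, assuming a tempo of 120 BPM (the tempo, poll interval, and four-beat cutoff are just illustrative values, not anything from the linked post):

```javascript
// Poll faster than the beat and decide inside the callback whether a
// full beat's worth of milliseconds has elapsed, so setInterval's own
// timing inaccuracy doesn't accumulate into drift.
const bpm = 120;                 // assumed tempo
const beatMs = 60000 / bpm;      // 500 ms per beat at 120 BPM
let nextBeat = Date.now() + beatMs;
let beatsPlayed = 0;

const timer = setInterval(() => {
  if (Date.now() >= nextBeat) {
    nextBeat += beatMs;          // advance along the beat grid, not from "now"
    beatsPlayed += 1;
    console.log('beat', beatsPlayed);   // trigger your visual/click here
    if (beatsPlayed === 4) clearInterval(timer); // stop the demo after one bar
  }
}, 100);                         // poll every 100 ms, well under one beat
```

Advancing `nextBeat` by exact multiples of `beatMs` (rather than resetting it to the current time) keeps a late callback from pushing every subsequent beat later.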
Even with that, music itself, unless generated by a computer, might not have a perfectly consistent tempo, so accounting for mistimed beats could be a hurdle for you, which may be why using audio analysis to find where the actual beats are going to happen could be a better route.
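If you do go the analysis route, one simple approach (a sketch of energy-based onset detection, not the only method) is to compare each frame's energy to the average of the frames just before it and flag the spikes. In a browser you would feed this from an AnalyserNode's getByteFrequencyData; here the frame energies are a plain array so the core logic is visible, and the window/threshold values are arbitrary illustrative choices:

```javascript
// Energy-based beat (onset) detection sketch.
// `energies` holds one number per analysis frame (e.g. summed FFT
// magnitudes); a frame counts as a beat when its energy exceeds
// `threshold` times the average of the previous `window` frames.
function detectBeats(energies, window = 4, threshold = 1.5) {
  const beats = [];
  for (let i = window; i < energies.length; i++) {
    const recent = energies.slice(i - window, i);
    const avg = recent.reduce((a, b) => a + b, 0) / window;
    if (energies[i] > avg * threshold) beats.push(i);
  }
  return beats;
}

// Quiet frames with two spikes: frames 4 and 9 stand out.
console.log(detectBeats([1, 1, 1, 1, 5, 1, 1, 1, 1, 6])); // → [ 4, 9 ]
```

Real music is messier than this, of course, so you would typically restrict the energy sum to the low-frequency bins (where kick drums live) and tune the threshold per track.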