I've used the FFT data from the Analyser node using the getByteFrequencyData
method in the Web Audio API to create a spectrum visualizer as shown below:
With 256 bins, each one will be ~86 Hz apart (44100 Hz sample rate / fftSize, where fftSize is twice the number of bins). So you start at zero and go up in ~86 Hz increments from there.
The actual values in the bins are just a representation of how much of each frequency is present in the signal (i.e. how "loud" the frequency is).
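For example, with a 44100 Hz sample rate and fftSize = 512 (256 bins), you can precompute the frequency each bar represents. A minimal sketch, with purely illustrative variable names:

```js
// Center frequency of each analyser bin, assuming a 44100 Hz sample rate
// and fftSize = 512 (i.e. 256 bins).
const sampleRate = 44100;
const fftSize = 512;
const binWidth = sampleRate / fftSize;   // ≈ 86.13 Hz between bins

const binFrequencies = Array.from(
  { length: fftSize / 2 },               // 256 bins
  (_, i) => i * binWidth                 // 0, 86.13, 172.27, ...
);

console.log(binFrequencies.slice(0, 3)); // [0, 86.1328125, 172.265625]
```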
Yes, getByteFrequencyData results in a normalized array of values between 0 and 255 (it copies the data into the array it gets passed in).
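A rough sketch of that call (this assumes you already have an AudioContext called audioCtx and some source node connected; the names are placeholders, not your actual code):

```js
// `audioCtx` is an existing AudioContext, `source` is whatever AudioNode
// (oscillator, media element source, ...) you want to analyse.
const analyser = audioCtx.createAnalyser();
analyser.fftSize = 512;                 // 256 frequency bins
source.connect(analyser);

// The array you pass in must be a Uint8Array of length frequencyBinCount;
// getByteFrequencyData copies the current FFT data into it, one byte
// (0 to 255) per bin.
const data = new Uint8Array(analyser.frequencyBinCount);

function draw() {
  analyser.getByteFrequencyData(data);  // fills `data` in place
  // ... use `data` to draw the spectrum bars ...
  requestAnimationFrame(draw);
}
draw();
```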
The frequency bands are split equally, so each element N of your array corresponds to:

N * sampleRate / fftSize

So the first bin is 0 Hz and, assuming a sample rate of 44100 and an <analyzerNode>.fftSize of 512, the second would be 86.13 Hz, and so on...
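In code that could look something like this, reading the actual values from the context and analyser rather than hard-coding them (same hypothetical audioCtx and analyser as in the sketch above):

```js
// Frequency (in Hz) represented by bin N.
function binToHz(n) {
  return n * audioCtx.sampleRate / analyser.fftSize;
}

binToHz(0); // 0
binToHz(1); // 86.1328125 when sampleRate is 44100 and fftSize is 512
```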
You will find these two questions and answers useful, on DSP and on SO:
Note that the length of your sample data is half of <analyzerNode>.fftSize (it equals <analyzerNode>.frequencyBinCount), effectively limiting the frequency range to half the sample rate (the Nyquist frequency).
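A quick check of that, reusing the hypothetical analyser and binToHz from above:

```js
// frequencyBinCount is always fftSize / 2, so the highest bin (index 255 here)
// sits just below sampleRate / 2 (the Nyquist frequency).
analyser.fftSize = 512;
console.log(analyser.frequencyBinCount);              // 256
console.log(binToHz(analyser.frequencyBinCount - 1)); // ≈ 21963.87 Hz (just under 22050)
```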