Creating an iPhone music visualiser based on the Fourier transform
Question: I am designing a music visualiser application for the iPhone. My plan is to capture audio via the iPhone's microphone, run a Fourier transform on it, and then build visualisations from the result. The best example of this I have found is aurioTouch, which produces a clean graph from FFT data. However, I have been struggling to understand and replicate aurioTouch in my own project. In particular, I cannot work out where exactly aurioTouch picks up the data from the microphone before it
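For reference, the capture-then-FFT pipeline described above can be sketched without aurioTouch's older remote I/O Audio Unit machinery. The following is a hypothetical Swift sketch, not aurioTouch's actual code: it assumes iOS 13+, uses `AVAudioEngine` to tap the microphone, and computes FFT magnitudes with Accelerate's vDSP. The class name `MicVisualiserSource` and the `onSpectrum` callback are my own illustrative inventions.

```swift
import AVFoundation
import Accelerate

/// Hypothetical sketch: tap the microphone with AVAudioEngine, then
/// run a real-to-complex FFT on each buffer and hand the magnitude
/// spectrum to a visualiser callback.
final class MicVisualiserSource {
    private let engine = AVAudioEngine()
    private let fftSize = 1024                      // must be a power of two
    private lazy var log2n = vDSP_Length(log2(Float(fftSize)))
    private lazy var fftSetup = vDSP_create_fftsetup(log2n, FFTRadix(kFFTRadix2))!

    /// Starts the microphone tap; `onSpectrum` receives fftSize/2 magnitudes.
    func start(onSpectrum: @escaping ([Float]) -> Void) throws {
        let input = engine.inputNode
        let format = input.outputFormat(forBus: 0)
        input.installTap(onBus: 0,
                         bufferSize: AVAudioFrameCount(fftSize),
                         format: format) { [self] buffer, _ in
            guard let channel = buffer.floatChannelData?[0],
                  Int(buffer.frameLength) >= fftSize else { return }

            var real = [Float](repeating: 0, count: fftSize / 2)
            var imag = [Float](repeating: 0, count: fftSize / 2)
            var magnitudes = [Float](repeating: 0, count: fftSize / 2)

            real.withUnsafeMutableBufferPointer { realPtr in
                imag.withUnsafeMutableBufferPointer { imagPtr in
                    var split = DSPSplitComplex(realp: realPtr.baseAddress!,
                                                imagp: imagPtr.baseAddress!)
                    // Pack the real samples into split-complex form, then
                    // run an in-place forward FFT and take squared magnitudes.
                    channel.withMemoryRebound(to: DSPComplex.self,
                                              capacity: fftSize / 2) {
                        vDSP_ctoz($0, 2, &split, 1, vDSP_Length(fftSize / 2))
                    }
                    vDSP_fft_zrip(fftSetup, &split, 1, log2n,
                                  FFTDirection(FFT_FORWARD))
                    vDSP_zvmags(&split, 1, &magnitudes, 1,
                                vDSP_Length(fftSize / 2))
                }
            }
            onSpectrum(magnitudes)   // feed the visualiser on each buffer
        }
        try engine.start()
    }
}
```

The key point for the question above: in this modern formulation the microphone data arrives in the `installTap` closure, one `AVAudioPCMBuffer` at a time; aurioTouch itself obtains the equivalent buffers through a render callback on the remote I/O Audio Unit rather than a tap.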