I am creating a metronome as part of a larger app and I have a few very short wav files to use as the individual sounds. I would like to use AVAudioEngine because NSTimer has significant timing inaccuracy.
I think one possible way to play sounds with the lowest possible timing error is to provide audio samples directly via a callback. On iOS you can do this with AudioUnit.
In this callback you can track the sample count and therefore know exactly which sample you are at. From the sample counter you can derive a time value (using the sample rate) and use it for high-level tasks like the metronome. When you see that it is time to play a metronome sound, you simply start copying that sound's audio samples into the output buffer.
This is the theoretical part without any code, but you can find many examples of AudioUnit and the callback technique.
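The bookkeeping a render callback would do can be sketched without any audio plumbing. The helper names below are mine for illustration, not part of any Apple API: one converts tempo to a period in frames, the other decides at which frame inside the next render buffer a tick should begin.

```swift
// Sketch of the sample-counting logic a render callback would use.
// These helpers are illustrative, not part of AVFoundation or AudioUnit.

/// Frames between ticks for a given tempo.
func framesPerTick(bpm: Double, sampleRate: Double) -> Int {
    return Int(sampleRate * 60.0 / bpm)
}

/// Given the absolute sample position at the start of a render buffer,
/// return the frame offset of the next tick inside that buffer,
/// or nil if no tick falls within it.
func tickOffset(bufferStart: Int, bufferLength: Int,
                bpm: Double, sampleRate: Double) -> Int? {
    let period = framesPerTick(bpm: bpm, sampleRate: sampleRate)
    let sinceLastTick = bufferStart % period
    let untilNextTick = sinceLastTick == 0 ? 0 : period - sinceLastTick
    return untilNextTick < bufferLength ? untilNextTick : nil
}
```

In the callback you would advance `bufferStart` by `bufferLength` on every render call, and whenever `tickOffset` returns a value, begin copying the tick sound's samples at that frame.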
I was able to make a buffer containing the sound from the file followed by silence of the required length. Hope this helps:
// audioFile here is an instance of AVAudioFile initialized with a wav file
func tickBuffer(forBpm bpm: Int) -> AVAudioPCMBuffer {
    audioFile.framePosition = 0 // rewind; required if you read several times from one AVAudioFile
    // one tick's length in frames for the given bpm (sound length + silence length)
    let periodLength = AVAudioFrameCount(audioFile.processingFormat.sampleRate * 60 / Double(bpm))
    let buffer = AVAudioPCMBuffer(pcmFormat: audioFile.processingFormat, frameCapacity: periodLength)!
    try! audioFile.read(into: buffer) // sorry for the forced try
    buffer.frameLength = periodLength // key to success: this appends silence after the sound
    return buffer
}
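To make the `periodLength` arithmetic concrete: at a 44 100 Hz sample rate, one beat at 120 bpm occupies 44100 × 60 / 120 = 22 050 frames; whatever the file does not fill becomes trailing silence.

```swift
// Worked example of the periodLength formula above (plain arithmetic,
// no AVFoundation needed).
let sampleRate = 44_100.0
let bpm = 120
let periodLength = UInt32(sampleRate * 60 / Double(bpm)) // 22050 frames per tick
```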
// player is an instance of AVAudioPlayerNode attached to your AVAudioEngine
func startLoop() {
    player.stop()
    let buffer = tickBuffer(forBpm: bpm)
    player.scheduleBuffer(buffer, at: nil, options: .loops, completionHandler: nil)
    player.play()
}
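The snippets above assume the engine and player are already wired up. A minimal setup might look like this; `tick.wav` is a placeholder name, and in a real app you would handle the errors instead of force-unwrapping:

```swift
import AVFoundation

// Minimal wiring assumed by startLoop() above.
let engine = AVAudioEngine()
let player = AVAudioPlayerNode()

// Replace with the actual location of your tick sound.
let url = Bundle.main.url(forResource: "tick", withExtension: "wav")!
let audioFile = try! AVAudioFile(forReading: url)

engine.attach(player)
// Connect in the file's processing format so the looping buffer from
// tickBuffer(forBpm:) plays back at the correct rate.
engine.connect(player, to: engine.mainMixerNode, format: audioFile.processingFormat)
try! engine.start()
```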
To expand upon 5hrp's answer:
Take the simple case where you have two beats, an upbeat (tone1) and a downbeat (tone2), and you want them out of phase with each other so the audio will be (up, down, up, down) at a certain bpm.
You will need two instances of AVAudioPlayerNode (one for each beat); let's call them audioNode1 and audioNode2.
The first beat you will want in phase, so set it up as normal:
let buffer = tickBuffer(forBpm: bpm)
audioNode1.scheduleBuffer(buffer, at: nil, options: .loops, completionHandler: nil)
Then you want the second beat to start exactly out of phase, i.e. half a beat period later. For this you can use an AVAudioTime value:
let audioTime2 = AVAudioTime(sampleTime: AVAudioFramePosition(audioFile2.processingFormat.sampleRate * 60 / Double(bpm) * 0.5), atRate: audioFile2.processingFormat.sampleRate)
You can pass this time when scheduling the buffer:
audioNode2.scheduleBuffer(buffer, at: audioTime2, options: .loops, completionHandler: nil)
This will play your two beats on loop, half a beat period out of phase with each other!
It's easy to see how to generalise this to more beats, to create a whole bar. It's not the most elegant solution though, because for, say, 16th notes you'd have to create 16 nodes.
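The generalisation amounts to giving the k-th of n evenly spaced players a start time of k/n of one beat period. A sketch of the offset arithmetic (the helper name is mine, not part of AVFoundation):

```swift
/// Start offset, in frames, for the k-th of n evenly spaced players
/// that all loop the same one-beat-long buffer.
/// (Illustrative helper, not an AVFoundation API.)
func startFrame(k: Int, of n: Int, bpm: Double, sampleRate: Double) -> Int {
    let period = sampleRate * 60.0 / bpm // frames in one full beat cycle
    return Int(period * Double(k) / Double(n))
}
```

Each offset would then be wrapped in `AVAudioTime(sampleTime:atRate:)` using the file's sample rate and passed as the `at:` argument of `scheduleBuffer`.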