I'll start with a simple "playground" view controller class I've made that demonstrates my problem:

class AudioEnginePlaygroundViewController: UIViewController
After some searching I've found the problem. The issue lies in the audio engine's inputNode singleton. From the docs:
The audio engine creates a singleton on demand when inputNode is first accessed. To receive input, connect another audio node from the output of the input audio node, or create a recording tap on it.
Plus a reference to the format issue I was experiencing:
Check the input format of input node (specifically, the hardware format) for a non-zero sample rate and channel count to see if input is enabled.
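For example (an illustrative sketch, not part of the original class), with mic being the engine's input node as in the code further below, the check the docs describe looks like this:

let hardwareFormat = mic.inputFormat(forBus: 0)
if hardwareFormat.sampleRate == 0 || hardwareFormat.channelCount == 0 {
    // A zero sample rate / channel count means the engine never configured itself for input,
    // so a tap installed on the input node will never receive buffers.
    print("Input is not enabled on this engine")
}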
In my playground class, the flow for triggering audio file playback never accesses the engine's inputNode before it creates an "active chain" with:
audioEngine.connect(playerNode, to: audioEngine.outputNode, format: audioFile.processingFormat)
It seems that you must access AVAudioEngine's inputNode before start()ing the engine if you want it to internally configure itself for input. Even stop()ing and reset()ing the engine does not cause a later access of inputNode to reconfigure it. (I suspect that manually breaking the active chain via disconnectNode calls would allow the internal reconfiguration, but I don't know that for sure yet.)
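For reference, the kind of disconnect call I have in mind looks like the following (an untested sketch of that suspicion, using AVAudioEngine's disconnectNodeOutput API; playerNode is the node from the active chain above):

// Break the playerNode -> outputNode connection that forms the active chain.
audioEngine.disconnectNodeOutput(playerNode)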
So code-wise the fix was simple: access the engine's input node immediately after instantiating the engine so that it configures itself for audio input. Here's the entire class with both file playback and mic tapping working together:
import UIKit
import AVFoundation

class AudioEnginePlaygroundViewController: UIViewController {
    private var audioEngine: AVAudioEngine!
    private var mic: AVAudioInputNode!
    private var micTapped = false

    override func viewDidLoad() {
        super.viewDidLoad()
        configureAudioSession()
        audioEngine = AVAudioEngine()
        // Accessing inputNode here, before the engine is ever started, is the fix:
        // it makes the engine configure itself for input as well as output.
        mic = audioEngine.inputNode!
    }

    @IBAction func toggleMicTap(_ sender: Any) {
        if micTapped {
            mic.removeTap(onBus: 0)
            micTapped = false
            return
        }

        let micFormat = mic.inputFormat(forBus: 0)
        mic.installTap(onBus: 0, bufferSize: 2048, format: micFormat) { (buffer, when) in
            // Raw samples for the first channel; process them here as needed.
            let sampleData = UnsafeBufferPointer(start: buffer.floatChannelData![0], count: Int(buffer.frameLength))
        }
        micTapped = true
        startEngine()
    }

    @IBAction func playAudioFile(_ sender: Any) {
        stopAudioPlayback()
        let playerNode = AVAudioPlayerNode()

        let audioUrl = Bundle.main.url(forResource: "test_audio", withExtension: "wav")!
        let audioFile = readableAudioFileFrom(url: audioUrl)
        audioEngine.attach(playerNode)
        audioEngine.connect(playerNode, to: audioEngine.outputNode, format: audioFile.processingFormat)
        startEngine()

        playerNode.scheduleFile(audioFile, at: nil, completionHandler: nil)
        playerNode.play()
    }

    // MARK: Internal Methods

    private func configureAudioSession() {
        do {
            try AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayAndRecord, with: [.mixWithOthers, .defaultToSpeaker])
            try AVAudioSession.sharedInstance().setActive(true)
        } catch {
            // Errors intentionally ignored in this playground class.
        }
    }

    private func readableAudioFileFrom(url: URL) -> AVAudioFile {
        var audioFile: AVAudioFile!
        do {
            try audioFile = AVAudioFile(forReading: url)
        } catch {
            // Errors intentionally ignored in this playground class.
        }
        return audioFile
    }

    private func startEngine() {
        guard !audioEngine.isRunning else {
            return
        }

        do {
            try audioEngine.start()
        } catch {
            // Errors intentionally ignored in this playground class.
        }
    }

    private func stopAudioPlayback() {
        audioEngine.stop()
        audioEngine.reset()
    }
}