You were very close! You were capturing audio in the didOutputSampleBuffer callback, but that's a high-frequency callback, so you were creating a lot of AVAudioPlayers and handing them raw LPCM data, when they only know how to parse Core Audio file types (and then they were going out of scope anyway). You can very easily play the buffers you're capturing with AVCaptureSession using AVAudioEngine's AVAudioPlayerNode, but at that point you may as well use AVAudioEngine to record from the microphone too:
import UIKit
import AVFoundation

class ViewController: UIViewController {

    // Keep the engine alive for the lifetime of the view controller,
    // otherwise playback stops as soon as it is deallocated.
    var engine = AVAudioEngine()

    override func viewDidLoad() {
        super.viewDidLoad()

        let input = engine.inputNode
        let player = AVAudioPlayerNode()
        engine.attach(player)

        let bus = 0
        let inputFormat = input.inputFormat(forBus: bus)
        engine.connect(player, to: engine.mainMixerNode, format: inputFormat)

        // Tap the microphone input and feed every captured buffer
        // straight back into the player node.
        input.installTap(onBus: bus, bufferSize: 512, format: inputFormat) { (buffer, time) -> Void in
            player.scheduleBuffer(buffer)
        }

        try! engine.start()
        player.play()
    }
}
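If you do want to keep capturing with AVCaptureSession instead, the same idea works: keep one long-lived AVAudioEngine and AVAudioPlayerNode around, convert each CMSampleBuffer from your didOutputSampleBuffer callback into an AVAudioPCMBuffer, and schedule that on the player node. Here's a rough, untested sketch of the conversion (it assumes the capture output hands you uncompressed LPCM, and pcmBuffer(from:) is just a helper name I made up for this example):

import AVFoundation
import CoreMedia

// Convert an LPCM CMSampleBuffer from AVCaptureAudioDataOutput into an
// AVAudioPCMBuffer that AVAudioPlayerNode can schedule.
func pcmBuffer(from sampleBuffer: CMSampleBuffer) -> AVAudioPCMBuffer? {
    guard let description = CMSampleBufferGetFormatDescription(sampleBuffer),
        let asbd = CMAudioFormatDescriptionGetStreamBasicDescription(description),
        let format = AVAudioFormat(streamDescription: asbd) else { return nil }

    let frameCount = AVAudioFrameCount(CMSampleBufferGetNumSamples(sampleBuffer))
    guard let buffer = AVAudioPCMBuffer(pcmFormat: format, frameCapacity: frameCount) else { return nil }
    buffer.frameLength = frameCount

    // Copy the raw sample data out of the CMSampleBuffer into the PCM buffer.
    let status = CMSampleBufferCopyPCMDataIntoAudioBufferList(
        sampleBuffer, at: 0, frameCount: Int32(frameCount), into: buffer.mutableAudioBufferList)
    return status == noErr ? buffer : nil
}

Then in the callback you'd do something like if let buffer = pcmBuffer(from: sampleBuffer) { player.scheduleBuffer(buffer) }, with the player attached, connected to the mixer using that same format, and playing, just like above.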