Question
My app is basically a phone call over MultipeerConnectivity.
Here is how I'm setting up the audio session. Note that recordingSession is of type AVAudioSession and captureSession is of type AVCaptureSession.
func setupAVRecorder() {
    print("\(#file) > \(#function) > Entry")

    do {
        try recordingSession.setCategory(AVAudioSessionCategoryPlayAndRecord)
        try recordingSession.setMode(AVAudioSessionModeVoiceChat)
        try recordingSession.setPreferredSampleRate(44100.00)
        try recordingSession.setPreferredIOBufferDuration(0.2)
        try recordingSession.setActive(true)

        recordingSession.requestRecordPermission() { [unowned self] (allowed: Bool) -> Void in
            DispatchQueue.main.async {
                if allowed {
                    do {
                        self.captureDevice = AVCaptureDevice.defaultDevice(withMediaType: AVMediaTypeAudio)
                        try self.captureDeviceInput = AVCaptureDeviceInput.init(device: self.captureDevice)

                        self.outputDevice = AVCaptureAudioDataOutput()
                        self.outputDevice?.setSampleBufferDelegate(self, queue: DispatchQueue.main)

                        self.captureSession = AVCaptureSession()
                        self.captureSession.addInput(self.captureDeviceInput)
                        self.captureSession.addOutput(self.outputDevice)
                        self.captureSession.startRunning()
                    }
                    catch let error {
                        print("\(#file) > \(#function) > ERROR: \(error.localizedDescription)")
                    }
                }
            }
        }
    }
    catch let error {
        print("\(#file) > \(#function) > ERROR: \(error.localizedDescription)")
    }
}
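As an aside, AVCaptureSession throws an Objective-C exception (not a Swift error) if an input or output cannot be added, so the usual defensive pattern is to guard with canAddInput/canAddOutput inside a begin/commitConfiguration pair. A minimal sketch, assuming `input` and `output` have already been created as in the code above:

```swift
import AVFoundation

func configure(_ session: AVCaptureSession,
               input: AVCaptureDeviceInput,
               output: AVCaptureAudioDataOutput) {
    session.beginConfiguration()
    // canAddInput/canAddOutput avoid the runtime exception that
    // addInput/addOutput raise when the session rejects the object.
    if session.canAddInput(input) { session.addInput(input) }
    if session.canAddOutput(output) { session.addOutput(output) }
    session.commitConfiguration()
    session.startRunning()
}
```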
Then I've got the captureOutput function from the AVCaptureAudioDataOutputSampleBufferDelegate. In this method I write the data to the outputStream.
func captureOutput(_ captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, from connection: AVCaptureConnection!) {
    var blockBuffer: CMBlockBuffer?
    var audioBufferList = AudioBufferList()

    CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer(
        sampleBuffer,
        nil,
        &audioBufferList,
        MemoryLayout<AudioBufferList>.size,
        nil,
        nil,
        kCMSampleBufferFlag_AudioBufferList_Assure16ByteAlignment,
        &blockBuffer)

    let buffers = UnsafeMutableAudioBufferListPointer(&audioBufferList)
    for buffer in buffers {
        let u8ptr = buffer.mData!.assumingMemoryBound(to: UInt8.self)
        _ = outputStream!.write(u8ptr, maxLength: Int(buffer.mDataByteSize))
    }
}
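One thing worth noting about that loop: OutputStream.write can return fewer bytes than requested (a short write) when the socket's buffer is full, and the code above silently drops the remainder. A hedged sketch of a retrying helper (the name `writeFully` is my own, not from the question):

```swift
import Foundation

// Write all `count` bytes, retrying on short writes.
// Returns false if the stream reports an error or closes.
func writeFully(_ stream: OutputStream, _ bytes: UnsafePointer<UInt8>, count: Int) -> Bool {
    var written = 0
    while written < count {
        let n = stream.write(bytes + written, maxLength: count - written)
        if n <= 0 { return false }   // error (-1) or end of stream (0)
        written += n
    }
    return true
}
```

In the capture callback, `_ = outputStream!.write(...)` would then become `_ = writeFully(outputStream!, u8ptr, count: Int(buffer.mDataByteSize))`.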
I receive data in func stream(_ aStream: Stream, handle eventCode: Stream.Event), which comes from StreamDelegate. In this method, when bytes are available, I call my own method readFromStream().
func readFromStream() {
    while inputStream!.hasBytesAvailable {
        var buffer = [UInt8](repeating: 0, count: 4096)
        let length = inputStream!.read(&buffer, maxLength: buffer.count)

        if length > 0 {
            let audioBuffer = bytesToAudioBuffer(buffer)
            let mainMixer = audioEngine!.mainMixerNode
            audioEngine!.connect(audioPlayer!, to: mainMixer, format: audioBuffer.format)
            audioPlayer!.scheduleBuffer(audioBuffer, completionHandler: nil)

            do {
                try audioEngine!.start()
            }
            catch let error as NSError {
                print("\(#file) > \(#function) > error: \(error.localizedDescription)")
            }

            audioPlayer!.play()
        }
    }
}
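Incidentally, calling connect, start, and play on every received chunk is wasteful; AVAudioEngine is normally wired up once, after which the read loop only schedules buffers. A sketch of that one-time setup (the 44.1 kHz mono format here is an assumption; in practice it should match the format produced by `bytesToAudioBuffer`):

```swift
import AVFoundation

let audioEngine = AVAudioEngine()
let audioPlayer = AVAudioPlayerNode()

func setupPlayback() throws {
    // Assumed format; must match the buffers being scheduled.
    let format = AVAudioFormat(standardFormatWithSampleRate: 44100, channels: 1)

    audioEngine.attach(audioPlayer)
    audioEngine.connect(audioPlayer, to: audioEngine.mainMixerNode, format: format)
    try audioEngine.start()
    audioPlayer.play()
}

// Then, per received chunk, the read loop reduces to:
// audioPlayer.scheduleBuffer(audioBuffer, completionHandler: nil)
```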
This method seems to run fine, except that no audio is actually played (just silence), but that's a separate issue. My issue right now is that when I put print statements inside these methods, one device constantly sends data without receiving any, while the other constantly receives data without sending any. I suspect this is because I'm using a single thread to handle both sending and receiving, so one device's input buffer never gets drained.
My guess is that the fix is to have one thread handle recording and sending, and another handle receiving and playing?
I'm fairly familiar with threads in general, but not with how threading works in Swift, so if that's the case, can anyone guide me on how to do this?
Thank you so much for your time!
Answer 1:
You need to use DispatchQueues. Here is an example of how to create one:
let streamReceiverQueue = DispatchQueue(label: "someNameForQueue", qos: DispatchQoS.userInteractive)
The label itself is not important; it just needs to be unique if you use multiple DispatchQueues. The qos sets the queue's priority: userInteractive is the highest, so work submitted to the queue executes promptly.
You can find more info on qos here
Then, if you want the queue to execute work synchronously, run it with:
streamReceiverQueue.sync {
// Here is what will execute on the streamReceiverQueue
}
And if you want it to execute work asynchronously, use:
streamReceiverQueue.async {
// Your code here
}
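Applied to your situation, one way to keep sending and receiving from starving each other is to give each direction its own serial queue: deliver the capture callbacks (and the outputStream writes inside them) on a send queue, and hop onto a receive queue whenever the input stream reports bytes. A sketch using the names from your question (the queue labels are placeholders):

```swift
import Foundation

// Two independent serial queues so sending never blocks receiving.
let sendQueue = DispatchQueue(label: "com.example.stream.send", qos: .userInteractive)
let receiveQueue = DispatchQueue(label: "com.example.stream.receive", qos: .userInteractive)

// 1. Capture callbacks off the main thread, onto the send queue:
// outputDevice?.setSampleBufferDelegate(self, queue: sendQueue)

// 2. Read on the receive queue when the stream delegate fires:
// func stream(_ aStream: Stream, handle eventCode: Stream.Event) {
//     if eventCode == .hasBytesAvailable {
//         receiveQueue.async { self.readFromStream() }
//     }
// }
```

With the delegate callbacks moved off the main thread this way, each device can drain its input stream while the other queue keeps writing.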
Source: https://stackoverflow.com/questions/42031790/ios-streaming-and-receiving-audio-from-a-device-to-another-ends-in-one-only-se