Question
I have created a video chat app for groups in iOS. I have been searching for a way to control the audio volume of each participant separately. I found a way to mute and unmute a participant using isPlaybackEnabled on RemoteAudioTrack, but not to control the volume.
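For reference, the mute toggle is just a property flip (a minimal sketch; remoteAudioTrack is assumed to be the participant's RemoteAudioTrack):
// Mute or unmute one participant's incoming audio.
remoteAudioTrack.isPlaybackEnabled = false // mute
remoteAudioTrack.isPlaybackEnabled = true  // unmute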
I also wondered whether I could route the audio through AVAudioPlayer. I found addSink. This is what I tried from here:
class Audio: NSObject, AudioSink {
    var a = 1

    func renderSample(_ audioSample: CMSampleBuffer!) {
        print("audio found", a)
        a += 1

        // Pull the PCM bytes out of the sample buffer.
        var audioBufferList = AudioBufferList()
        var data = Data()
        var blockBuffer: CMBlockBuffer?
        CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer(audioSample, bufferListSizeNeededOut: nil, bufferListOut: &audioBufferList, bufferListSize: MemoryLayout<AudioBufferList>.size, blockBufferAllocator: nil, blockBufferMemoryAllocator: nil, flags: 0, blockBufferOut: &blockBuffer)

        let buffers = UnsafeBufferPointer<AudioBuffer>(start: &audioBufferList.mBuffers, count: Int(audioBufferList.mNumberBuffers))
        for audioBuffer in buffers {
            let frame = audioBuffer.mData?.assumingMemoryBound(to: UInt8.self)
            data.append(frame!, count: Int(audioBuffer.mDataByteSize))
        }

        // Try to play the collected bytes directly.
        let player = try! AVAudioPlayer(data: data) // crash here
        player.play()
    }
}
But it crashed on let player = try! AVAudioPlayer(data: data).
EDIT:
This is the error: Fatal error: 'try!' expression unexpectedly raised an error: Error Domain=NSOSStatusErrorDomain Code=-39 "(null)": file.
This is data, so I guess it was not converted:
▿ 0 bytes
- count : 0
▿ pointer : 0x000000016d7ae160
- pointerValue : 6131736928
- bytes : 0 elements
And this is the audioSample format description:
<CMAudioFormatDescription 0x2815a3de0 [0x1bb2ef830]> {
mediaType:'soun'
mediaSubType:'lpcm'
mediaSpecific: {
ASBD: {
mSampleRate: 16000.000000
mFormatID: 'lpcm'
mFormatFlags: 0xc
mBytesPerPacket: 2
mFramesPerPacket: 1
mBytesPerFrame: 2
mChannelsPerFrame: 1
mBitsPerChannel: 16 }
cookie: {(null)}
ACL: {(null)}
FormatList Array: {(null)}
}
extensions: {(null)}
}
Answer 1:
You can get the full data buffer from CMSampleBuffer and convert it to Data:
// Get the block buffer backing the sample and copy its bytes into Data.
guard let blockBuffer = CMSampleBufferGetDataBuffer(sampleBuffer) else { return }
let blockBufferDataLength = CMBlockBufferGetDataLength(blockBuffer)
var blockBufferData = [UInt8](repeating: 0, count: blockBufferDataLength)
let status = CMBlockBufferCopyDataBytes(blockBuffer, atOffset: 0, dataLength: blockBufferDataLength, destination: &blockBufferData)
guard status == noErr else { return }
let data = Data(bytes: blockBufferData, count: blockBufferDataLength)
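To hand those bytes to AVAudioEngine (see below), you can wrap them in an AVAudioPCMBuffer matching the format description from the question: 16 kHz, mono, 16-bit integer LPCM. A minimal sketch that converts to the float format the engine's mixer expects (makePCMBuffer is just an illustrative helper name):
import AVFoundation

// Illustrative helper: wraps raw 16 kHz mono Int16 samples in an
// AVAudioPCMBuffer so AVAudioEngine can play them.
func makePCMBuffer(from data: Data) -> AVAudioPCMBuffer? {
    // Standard (Float32, deinterleaved) format at the sample rate and
    // channel count reported by the ASBD above.
    guard let format = AVAudioFormat(standardFormatWithSampleRate: 16000, channels: 1) else { return nil }
    let frameCount = AVAudioFrameCount(data.count / MemoryLayout<Int16>.size)
    guard let buffer = AVAudioPCMBuffer(pcmFormat: format, frameCapacity: frameCount) else { return nil }
    buffer.frameLength = frameCount
    // Convert each 16-bit sample to Float32 in -1.0 ... 1.0.
    data.withUnsafeBytes { (raw: UnsafeRawBufferPointer) in
        let samples = raw.bindMemory(to: Int16.self)
        guard let channel = buffer.floatChannelData?[0] else { return }
        for i in 0..<Int(frameCount) {
            channel[i] = Float(samples[i]) / Float(Int16.max)
        }
    }
    return buffer
}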
Also refer to the AVAudioPlayer overview:
Use this class for audio playback unless you are playing audio captured from a network stream or require very low I/O latency.
So I don't think it will work for you. You would be better off using AVAudioEngine or Audio Queue Services.
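For example, with AVAudioEngine you can attach one AVAudioPlayerNode per participant and schedule the converted buffers on it; the node's volume property then gives you exactly the per-participant volume control the question asks about. A minimal sketch, assuming the makePCMBuffer helper above (all names here are illustrative, not Twilio API):
import AVFoundation

let engine = AVAudioEngine()
let playerNode = AVAudioPlayerNode() // one node per remote participant

func startPlayback() throws {
    engine.attach(playerNode)
    // Connect using the same 16 kHz mono float format the buffers use;
    // the mixer handles sample-rate conversion to the output.
    let format = AVAudioFormat(standardFormatWithSampleRate: 16000, channels: 1)
    engine.connect(playerNode, to: engine.mainMixerNode, format: format)
    try engine.start()
    playerNode.play()
}

// Call this from the AudioSink's renderSample for this participant's track.
func enqueue(_ data: Data) {
    guard let buffer = makePCMBuffer(from: data) else { return }
    playerNode.scheduleBuffer(buffer, completionHandler: nil)
}

// Per-participant volume, 0.0 ... 1.0.
func setVolume(_ volume: Float) {
    playerNode.volume = volume
}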
Answer 2:
Try saving the audio file to the document directory and then playing the sound. This works for me:
func playMusic() {
    // Load the bundled audio file and configure the session for playback.
    guard let url = Bundle.main.url(forResource: "Audio", withExtension: "mp3"),
          let data = try? Data(contentsOf: url) else { return }
    try? AVAudioSession.sharedInstance().setCategory(.playback, mode: .default)
    try? AVAudioSession.sharedInstance().setActive(true)
    audioPlayer = try? AVAudioPlayer(data: data, fileTypeHint: AVFileType.mp3.rawValue)
    audioPlayer?.prepareToPlay()
    audioPlayer?.play()
}
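If you want to actually write received data to the documents directory and play it from there, as the answer's first sentence suggests, a minimal sketch might look like this (the file name and the audioPlayer property are illustrative):
func play(data: Data) throws {
    // Write the bytes to a file in the documents directory, then play that file.
    let docs = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)[0]
    let fileURL = docs.appendingPathComponent("Audio.mp3")
    try data.write(to: fileURL)
    audioPlayer = try AVAudioPlayer(contentsOf: fileURL)
    audioPlayer?.prepareToPlay()
    audioPlayer?.play()
}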
Source: https://stackoverflow.com/questions/57075466/play-audio-from-cmsamplebuffer