Generating a tone in iOS with 16 bit PCM, AudioEngine.connect() throws AUSetFormat: error -10868


Question


I have the following code for generating an audio tone of given frequency and duration. It's loosely based on this answer for doing the same thing on Android (thanks: @Steve Pomeroy):

https://stackoverflow.com/a/3731075/973364

import Foundation
import CoreAudio
import AVFoundation
import Darwin

class AudioUtil {

    class func play(frequency: Int, durationMs: Int) -> Void {
        let sampleRateHz: Double = 8000.0
        let numberOfSamples = Int((Double(durationMs) / 1000 * sampleRateHz))
        let factor: Double = 2 * M_PI / (sampleRateHz/Double(frequency))

        // Generate an array of Doubles.
        var samples = [Double](count: numberOfSamples, repeatedValue: 0.0)

        for i in 1..<numberOfSamples {
            let sample = sin(factor * Double(i))
            samples[i] = sample
        }

        // Convert to a 16 bit PCM sound array.
        var index = 0
        var sound = [Byte](count: 2 * numberOfSamples, repeatedValue: 0)

        for doubleValue in samples {
            // Scale to maximum amplitude. Int16.max is 32,767.
            var value = Int16(doubleValue * Double(Int16.max))

            // In a 16 bit wav PCM, first byte is the low order byte.
            var firstByte = Int16(value & 0x00ff)
            var secondByteHighOrderBits = Int32(value) & 0xff00
            var secondByte = Int16(secondByteHighOrderBits >> 8) // Right shift.

            // println("\(doubleValue) -> \(value) -> \(firstByte), \(secondByte)")

            sound[index++] = Byte(firstByte)
            sound[index++] = Byte(secondByte)
        }

        let format = AVAudioFormat(commonFormat: AVAudioCommonFormat.PCMFormatInt16, sampleRate: sampleRateHz, channels:AVAudioChannelCount(1), interleaved: false)
        let buffer = AudioBuffer(mNumberChannels: 1, mDataByteSize: UInt32(sound.count), mData: &sound)
        let pcmBuffer = AVAudioPCMBuffer(PCMFormat: format, frameCapacity: AVAudioFrameCount(sound.count))
        let audioEngine = AVAudioEngine()
        let audioPlayer = AVAudioPlayerNode()

        audioEngine.attachNode(audioPlayer)
        // Runtime error occurs here:
        audioEngine.connect(audioPlayer, to: audioEngine.mainMixerNode, format: format)
        audioEngine.startAndReturnError(nil)

        audioPlayer.play()
        audioPlayer.scheduleBuffer(pcmBuffer, atTime: nil, options: nil, completionHandler: nil)
    }
}

The error I get at runtime when calling connect() on the AVAudioEngine is this:

ERROR:     [0x3bfcb9dc] AVAudioNode.mm:521: AUSetFormat: error -10868
*** Terminating app due to uncaught exception 'com.apple.coreaudio.avfaudio', reason: 'error -10868'

Is what I'm generating not really AVAudioCommonFormat.PCMFormatInt16?

[EDIT]

Here's another, simpler attempt using only one buffer as PCMFormatFloat32. There's no error, but no sound either.

import AVFoundation

class AudioManager:NSObject {

    let audioPlayer = AVAudioPlayerNode()

    lazy var audioEngine: AVAudioEngine = {
        let engine = AVAudioEngine()

        // Must happen only once.
        engine.attachNode(self.audioPlayer)

        return engine
    }()

    func play(frequency: Int, durationMs: Int, completionBlock:dispatch_block_t!) {
        var error: NSError?

        var mixer = audioEngine.mainMixerNode
        var sampleRateHz: Float = Float(mixer.outputFormatForBus(0).sampleRate)
        var numberOfSamples = AVAudioFrameCount((Float(durationMs) / 1000 * sampleRateHz))

        var format = AVAudioFormat(commonFormat: AVAudioCommonFormat.PCMFormatFloat32, sampleRate: Double(sampleRateHz), channels: AVAudioChannelCount(1), interleaved: false)

        var buffer = AVAudioPCMBuffer(PCMFormat: format, frameCapacity: numberOfSamples)
        buffer.frameLength = numberOfSamples

        // Generate sine wave
        for var i = 0; i < Int(buffer.frameLength); i++ {
            var val = sinf(Float(frequency) * Float(i) * 2 * Float(M_PI) / sampleRateHz)

            // log.debug("val: \(val)")

            buffer.floatChannelData.memory[i] = val * 0.5
        }

        // Audio engine
        audioEngine.connect(audioPlayer, to: mixer, format: format)

        log.debug("Sample rate: \(sampleRateHz), samples: \(numberOfSamples), format: \(format)")

        if !audioEngine.startAndReturnError(&error) {
            log.debug("Error: \(error)")
        }

        // Play player and buffer
        audioPlayer.play()
        audioPlayer.scheduleBuffer(buffer, atTime: nil, options: nil, completionHandler: completionBlock)
    }
}

Thanks: Thomas Royal (http://www.tmroyal.com/playing-sounds-in-swift-audioengine.html)


Answer 1:


The problem was that the player was getting cleaned up when execution fell out of the play() function, so it never completed (or barely started) playing. Here's one fairly clumsy solution: sleep for the duration of the sample before returning from play().

If anyone posts a better answer that avoids this by keeping the player from being cleaned up, I'll accept it.

import AVFoundation

class AudioManager: NSObject, AVAudioPlayerDelegate {

    let audioPlayerNode = AVAudioPlayerNode()

    var waveAudioPlayer: AVAudioPlayer?

    var playing: Bool! = false

    lazy var audioEngine: AVAudioEngine = {
        let engine = AVAudioEngine()

        // Must happen only once.
        engine.attachNode(self.audioPlayerNode)

        return engine
    }()

    func playWaveFromBundle(filename: String, durationInSeconds: NSTimeInterval) -> Void {
        var error: NSError?
        var sound = NSURL(fileURLWithPath: NSBundle.mainBundle().pathForResource(filename, ofType: "wav")!)

        if error != nil {
            log.error("Error: \(error)")
            return
        }

        self.waveAudioPlayer = AVAudioPlayer(contentsOfURL: sound, error: &error)
        self.waveAudioPlayer!.delegate = self

        AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayback, error: &error)

        if error != nil {
            log.error("Error: \(error)")
            return
        }

        log.verbose("Playing \(sound)")

        self.waveAudioPlayer!.prepareToPlay()

        playing = true

        if !self.waveAudioPlayer!.play() {
            log.error("Failed to play")
        }

        // If we don't block here, the player stops as soon as this function returns. While we'd prefer to wait for audioPlayerDidFinishPlaying() to be called here, it's never called if we block here. Instead, pass in the duration of the wave file and simply sleep for that long.
        /*
        while (playing!) {
            NSThread.sleepForTimeInterval(0.1) // seconds
        }
        */

        NSThread.sleepForTimeInterval(durationInSeconds)

        log.verbose("Done")
    }

    func play(frequency: Int, durationInMillis: Int, completionBlock:dispatch_block_t!) -> Void {
        var session = AVAudioSession.sharedInstance()
        var error: NSError?

        if !session.setCategory(AVAudioSessionCategoryPlayAndRecord, error: &error) {
            log.error("Error: \(error)")
            return
        }

        var mixer = audioEngine.mainMixerNode
        var sampleRateHz: Float = Float(mixer.outputFormatForBus(0).sampleRate)
        var numberOfSamples = AVAudioFrameCount((Float(durationInMillis) / 1000 * sampleRateHz))

        var format = AVAudioFormat(commonFormat: AVAudioCommonFormat.PCMFormatFloat32, sampleRate: Double(sampleRateHz), channels: AVAudioChannelCount(1), interleaved: false)

        var buffer = AVAudioPCMBuffer(PCMFormat: format, frameCapacity: numberOfSamples)
        buffer.frameLength = numberOfSamples

        // Generate sine wave
        for var i = 0; i < Int(buffer.frameLength); i++ {
            var val = sinf(Float(frequency) * Float(i) * 2 * Float(M_PI) / sampleRateHz)

            // log.debug("val: \(val)")

            buffer.floatChannelData.memory[i] = val * 0.5
        }

        AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayback, error: &error)

        if error != nil {
            log.error("Error: \(error)")
            return
        }

        // Audio engine
        audioEngine.connect(audioPlayerNode, to: mixer, format: format)

        log.debug("Sample rate: \(sampleRateHz), samples: \(numberOfSamples), format: \(format)")

        if !audioEngine.startAndReturnError(&error) {
            log.error("Error: \(error)")
            return
        }

        // TODO: Check we're not in the background. Attempting to play audio while in the background throws:
        //   *** Terminating app due to uncaught exception 'com.apple.coreaudio.avfaudio', reason: 'error 561015905'

        // Play player and schedule buffer
        audioPlayerNode.play()
        audioPlayerNode.scheduleBuffer(buffer, atTime: nil, options: nil, completionHandler: completionBlock)

        // If we don't block here, the player stops as soon as this function returns.
        NSThread.sleepForTimeInterval(Double(durationInMillis) / 1000.0) // convert milliseconds to seconds
    }

    // MARK: AVAudioPlayerDelegate

    func audioPlayerDidFinishPlaying(player: AVAudioPlayer!, successfully flag: Bool) {
        log.verbose("Success: \(flag)")

        playing = false
    }

    func audioPlayerDecodeErrorDidOccur(player: AVAudioPlayer!, error: NSError!) {
        log.verbose("Error: \(error)")

        playing = false
    }

    // MARK: NSObject overrides

    deinit {
        log.verbose("deinit")
    }

}

For context, this AudioManager is a lazily loaded property on my AppDelegate:

lazy var audioManager: AudioManager = {
    return AudioManager()
}()
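
A hypothetical call site, kept roughly in the Swift 1.x-era syntax of the code above (the 440 Hz frequency and 1000 ms duration are example values only):

    import UIKit

    // Hypothetical call site; assumes the AudioManager above is reachable via the AppDelegate.
    let appDelegate = UIApplication.sharedApplication().delegate as! AppDelegate
    appDelegate.audioManager.play(440, durationInMillis: 1000, completionBlock: nil)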



Answer 2:


Try setting your session category to AVAudioSessionCategoryPlayback or AVAudioSessionCategoryPlayAndRecord. I'm using play-and-record, and calling it before recording seems to work out fine. I'm guessing it has to go before you start connecting nodes.

    var session = AVAudioSession.sharedInstance()
    session.setCategory(AVAudioSessionCategoryPlayAndRecord, error: &error)
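
For reference, here is a minimal sketch of the same session setup in current Swift syntax (the error:-pointer API above is from Swift 1.x); whether you want .playback or .playAndRecord depends on your app:

    import AVFoundation

    // Minimal sketch: configure the shared audio session before wiring up the AVAudioEngine graph.
    func configureAudioSession() {
        do {
            // .playback is an assumption here; use .playAndRecord if you also capture audio.
            try AVAudioSession.sharedInstance().setCategory(.playback, mode: .default, options: [])
            try AVAudioSession.sharedInstance().setActive(true)
        } catch {
            print("Failed to configure the audio session: \(error)")
        }
    }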



Answer 3:


Regarding the issue of not getting sound, even when using PCMFormatFloat32:

I've wrestled with the same issue for a few days now and finally found the (or at least one) problem: you need to manually set the frameLength of the PCM Buffer:

pcmBuffer.frameLength = AVAudioFrameCount(sound.count/2)

The division by two accounts for the two bytes per frame (each 16-bit sample is encoded in two bytes).

Besides that, another change I made (though I don't yet know whether it matters) was to make the AVAudioEngine and the AVAudioPlayerNode members of the class, so that they aren't destroyed before playback ends.
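
Put together, a minimal sketch of the buffer setup in current Swift syntax (toneFrequency and the one-second duration are example values, not taken from the question); the important line is the frameLength assignment:

    import AVFoundation

    let sampleRate = 44_100.0
    let toneFrequency = 440.0
    let frameCount = AVAudioFrameCount(sampleRate * 1.0) // one second of audio

    if let format = AVAudioFormat(commonFormat: .pcmFormatFloat32, sampleRate: sampleRate,
                                  channels: 1, interleaved: false),
       let buffer = AVAudioPCMBuffer(pcmFormat: format, frameCapacity: frameCount),
       let channel = buffer.floatChannelData?[0] {
        for frame in 0..<Int(frameCount) {
            channel[frame] = Float(sin(2.0 * Double.pi * toneFrequency * Double(frame) / sampleRate)) * 0.5
        }
        // Without this, the buffer reports zero valid frames and nothing is audible.
        buffer.frameLength = frameCount
    }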




Answer 4:


I was seeing the same behaviour as you, and like you I was working around it with NSThread.sleepForTimeInterval(). I've now found a solution that works for me: the AVAudioEngine object needs to be initialised outside of the play() function, at the class level, so the engine can keep working and playing the sound even after the function returns (which happens immediately). As soon as I moved the line that initialises the AVAudioEngine, the sound could be heard even without the sleeping "helper". Hope this helps.
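
In other words, something like the following shape (a sketch in current Swift syntax; ToneGenerator and its members are hypothetical names): the engine and player node are stored properties, so they outlive any single call to play().

    import AVFoundation

    final class ToneGenerator {
        // Stored properties: these survive after play(buffer:) returns.
        private let engine = AVAudioEngine()
        private let playerNode = AVAudioPlayerNode()

        init() {
            engine.attach(playerNode)
            engine.connect(playerNode, to: engine.mainMixerNode,
                           format: engine.mainMixerNode.outputFormat(forBus: 0))
        }

        func play(buffer: AVAudioPCMBuffer) throws {
            if !engine.isRunning {
                try engine.start()
            }
            playerNode.scheduleBuffer(buffer, completionHandler: nil)
            playerNode.play()
        }
    }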




Answer 5:


To get the right number of samples (numberOfSamples): mixer.outputFormatForBus(0).sampleRate returns 44100.0, so the multiplication by 1000 is not necessary in the second example.

Also, calling play() first and then scheduleBuffer() on the player node seems illogical to me; I would do it the other way around.
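
A short sketch of both points in current Swift syntax (mixer, playerNode and buffer are assumed to exist, as in the code above):

    // Size the buffer from the mixer's actual output sample rate rather than a hard-coded value.
    let sampleRate = mixer.outputFormat(forBus: 0).sampleRate   // typically 44100.0 on device

    // Schedule the buffer first, then start the player node.
    playerNode.scheduleBuffer(buffer, completionHandler: nil)
    playerNode.play()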



Source: https://stackoverflow.com/questions/28058777/generating-a-tone-in-ios-with-16-bit-pcm-audioengine-connect-throws-ausetform
