Get microphone input using Audio Queue in Swift 3


Question


I am developing an app that records voice via the built-in microphone and sends it to a server live, so I need to get the byte stream from the microphone while recording.

After googling and stack-overflowing for quite a while, I think I figured out how it should work, but it does not. I think using Audio Queues might be the way to go.

Here is what I tried so far:

func test() {
    func callback(_ a :UnsafeMutableRawPointer?, _ b : AudioQueueRef, _ c :AudioQueueBufferRef, _ d :UnsafePointer<AudioTimeStamp>, _ e :UInt32, _ f :UnsafePointer<AudioStreamPacketDescription>?) {
        print("test")
    }

    var inputQueue: AudioQueueRef? = nil

    var aqData = AQRecorderState(
        mDataFormat: AudioStreamBasicDescription(
            mSampleRate: 16000,
            mFormatID: kAudioFormatLinearPCM,
            mFormatFlags: 0,
            mBytesPerPacket: 2,
            mFramesPerPacket: 1,     // Must be set to 1 for uncompressed formats
            mBytesPerFrame: 2,
            mChannelsPerFrame: 1,    // Mono recording
            mBitsPerChannel: 2 * 8,  // 2 Bytes
            mReserved: 0),  // Must be set to 0 according to https://developer.apple.com/reference/coreaudio/audiostreambasicdescription
        mQueue: inputQueue!,
        mBuffers: [AudioQueueBufferRef](),
        bufferByteSize: 32,
        mCurrentPacket: 0,
        mIsRunning: true)

    var error = AudioQueueNewInput(&aqData.mDataFormat,
                                   callback,
                                   nil,
                                   nil,
                                   nil,
                                   0,
                                   &inputQueue)
    AudioQueueStart(inputQueue!, nil)
}

It compiles and the app starts, but as soon as I call test() I get an exception:

fatal error: unexpectedly found nil while unwrapping an Optional value

The exception is caused by

mQueue: inputQueue!

I understand why this happens (inputQueue has no value) but I don't know how to initialise inputQueue correctly. The problem is that Audio Queues are very poorly documented for Swift users and I didn't find any working example on the internet.

Can anybody tell me what I am doing wrong?


Answer 1:


Use AudioQueueNewInput(...) (or the output equivalent) to initialize your audio queue before using it:

let sampleRate = 16000
let numChannels = 2
var inFormat = AudioStreamBasicDescription(
        mSampleRate:        Double(sampleRate),
        mFormatID:          kAudioFormatLinearPCM,
        mFormatFlags:       kAudioFormatFlagsNativeFloatPacked,
        mBytesPerPacket:    UInt32(numChannels * MemoryLayout<UInt32>.size),
        mFramesPerPacket:   1,
        mBytesPerFrame:     UInt32(numChannels * MemoryLayout<UInt32>.size),
        mChannelsPerFrame:  UInt32(numChannels),
        mBitsPerChannel:    UInt32(8 * (MemoryLayout<UInt32>.size)),
        mReserved:          UInt32(0))

var inQueue: AudioQueueRef? = nil
AudioQueueNewInput(&inFormat, callback, nil, nil, nil, 0, &inQueue)

var aqData = AQRecorderState(
    mDataFormat:    inFormat, 
    mQueue:         inQueue!, // inQueue is initialized now and can be unwrapped
    mBuffers: [AudioQueueBufferRef](),
    bufferByteSize: 32,
    mCurrentPacket: 0,
    mIsRunning:     true)
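
Once the queue exists, it still needs buffers enqueued before the callback will ever fire, and then a call to AudioQueueStart. Here is a minimal sketch of that step, continuing from the code above (it assumes import AudioToolbox; the buffer count and byte size are illustrative values, not taken from this answer):

// Allocate a few buffers and hand them to the queue; the callback is
// invoked each time one of them has been filled with microphone data.
let numberOfBuffers = 3            // illustrative value
let bufferByteSize: UInt32 = 2048  // illustrative value

if let queue = inQueue {
    for _ in 0..<numberOfBuffers {
        var buffer: AudioQueueBufferRef? = nil
        AudioQueueAllocateBuffer(queue, bufferByteSize, &buffer)
        if let buffer = buffer {
            AudioQueueEnqueueBuffer(queue, buffer, 0, nil)
        }
    }
    AudioQueueStart(queue, nil)    // begin recording
}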

Find the details in Apple's documentation.




Answer 2:


This code from our project works fine:

AudioBuffer * buff; 
AudioQueueRef queue;
AudioStreamBasicDescription  fmt = { 0 };


static void HandleInputBuffer (
                               void                                 *aqData,
                               AudioQueueRef                        inAQ,
                               AudioQueueBufferRef                  inBuffer,
                               const AudioTimeStamp                 *inStartTime,
                               UInt32                               inNumPackets,
                               const AudioStreamPacketDescription   *inPacketDesc

                               ) {

 }



- (void) initialize  {


    thisClass = self;

    __block struct AQRecorderState aqData;

    NSError * error;

    fmt.mFormatID         = kAudioFormatLinearPCM;
    fmt.mSampleRate       = 44100.0;
    fmt.mChannelsPerFrame = 1;
    fmt.mBitsPerChannel   = 16;
    fmt.mFramesPerPacket  = 1;
    fmt.mBytesPerFrame    = sizeof (SInt16);
    fmt.mBytesPerPacket   = sizeof (SInt16);
    fmt.mFormatFlags      = kLinearPCMFormatFlagIsSignedInteger | kLinearPCMFormatFlagIsPacked;

    OSStatus status = AudioQueueNewInput (                              // 1
                        &fmt,                          // 2
                        HandleInputBuffer,                            // 3
                        &aqData,                                      // 4
                        NULL,                                         // 5
                        kCFRunLoopCommonModes,                        // 6
                        0,                                            // 7
                        &queue                                // 8
                        );

    AudioQueueBufferRef  buffers[kNumberBuffers];
    UInt32 bufferByteSize = kSamplesSize;
    for (int i = 0; i < kNumberBuffers; ++i) {           // 1
        OSStatus allocateStatus;
        allocateStatus =  AudioQueueAllocateBuffer (                       // 2
                                  queue,                               // 3
                                  bufferByteSize,                       // 4
                                  &buffers[i]                          // 5
                                  );
        OSStatus  enqueStatus;
        NSLog(@"allocateStatus = %d" , allocateStatus);
        enqueStatus =   AudioQueueEnqueueBuffer (                        // 6
                                 queue,                               // 7
                                 buffers[i],                          // 8
                                 0,                                           // 9
                                 NULL                                         // 10
                                 );
        NSLog(@"enqueStatus = %d" , enqueStatus);
    }

    AudioQueueStart (                                    // 3
                     queue,                                   // 4
                     NULL                                             // 5
                     );


}
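
For the original goal of streaming the recorded bytes to a server, the callback is where the data arrives. Here is a rough Swift sketch of what the question's callback could do with each filled buffer, assuming import AudioToolbox and Foundation (sendToServer is a hypothetical placeholder, not part of either answer):

func callback(_ userData: UnsafeMutableRawPointer?,
              _ queue: AudioQueueRef,
              _ buffer: AudioQueueBufferRef,
              _ startTime: UnsafePointer<AudioTimeStamp>,
              _ numPackets: UInt32,
              _ packetDesc: UnsafePointer<AudioStreamPacketDescription>?) {
    // Copy the raw PCM bytes out of the filled buffer.
    let byteCount = Int(buffer.pointee.mAudioDataByteSize)
    let data = Data(bytes: buffer.pointee.mAudioData, count: byteCount)

    // sendToServer(data)   // hypothetical upload hook; replace with your own networking code

    // Hand the buffer back to the queue so recording keeps going.
    AudioQueueEnqueueBuffer(queue, buffer, 0, nil)
}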


Source: https://stackoverflow.com/questions/43519077/get-microphone-input-using-audio-queue-in-swift-3
