AVPlayer playback of single channel audio stereo->mono

无人及你 asked 2020-12-17 04:18

In my iPad/iPhone app I'm playing back a video using AVPlayer. The video file has a stereo audio track, but I need to play back only one channel of this track in mono.

1 Answer
  • answered 2020-12-17 05:16

    I finally found an answer to this question - at least for deployment on iOS 6. You can easily add an MTAudioProcessingTap to your existing AVPlayerItem and copy the selected channel's samples over the other channel in your process callback. Here is a great tutorial explaining the basics: http://chritto.wordpress.com/2013/01/07/processing-avplayers-audio-with-mtaudioprocessingtap/

    This is my code so far, mostly copied from the link above.

    During AVPlayer setup I assign callback functions for audio processing:

    MTAudioProcessingTapCallbacks callbacks;
    callbacks.version = kMTAudioProcessingTapCallbacksVersion_0;
    // Under ARC this needs a __bridge cast; the tap must not outlive self
    callbacks.clientInfo = (__bridge void *)(self);
    callbacks.init = init;
    callbacks.prepare = prepare;
    callbacks.process = process;
    callbacks.unprepare = unprepare;
    callbacks.finalize = finalize;
    
    MTAudioProcessingTapRef tap;
    // The create function makes a copy of our callbacks struct
    OSStatus err = MTAudioProcessingTapCreate(kCFAllocatorDefault, &callbacks,
                                              kMTAudioProcessingTapCreationFlag_PostEffects, &tap);
    if (err || !tap) {
        NSLog(@"Unable to create the Audio Processing Tap");
        return;
    }
    
    // Create input parameters for the asset's audio track and attach the tap
    AVAssetTrack *audioTrack = [[playerItem.asset tracksWithMediaType:AVMediaTypeAudio] firstObject];
    AVMutableAudioMixInputParameters *audioInputParam =
        [AVMutableAudioMixInputParameters audioMixInputParametersWithTrack:audioTrack];
    audioInputParam.audioTapProcessor = tap;
    
    // Create a new AVAudioMix and assign it to our AVPlayerItem
    AVMutableAudioMix *audioMix = [AVMutableAudioMix audioMix];
    audioMix.inputParameters = @[audioInputParam];
    playerItem.audioMix = audioMix;
    

    Here are the audio processing callbacks (process does the real work; init just stashes the clientInfo pointer in the tap's storage):

    #pragma mark Audio Processing
    
    void init(MTAudioProcessingTapRef tap, void *clientInfo, void **tapStorageOut) {
        NSLog(@"Initialising the Audio Tap Processor");
        *tapStorageOut = clientInfo;
    }
    
    void finalize(MTAudioProcessingTapRef tap) {
        NSLog(@"Finalizing the Audio Tap Processor");
    }
    
    void prepare(MTAudioProcessingTapRef tap, CMItemCount maxFrames, const AudioStreamBasicDescription *processingFormat) {
        NSLog(@"Preparing the Audio Tap Processor");
    }
    
    void unprepare(MTAudioProcessingTapRef tap) {
        NSLog(@"Unpreparing the Audio Tap Processor");
    }
    
    void process(MTAudioProcessingTapRef tap, CMItemCount numberFrames,
             MTAudioProcessingTapFlags flags, AudioBufferList *bufferListInOut,
             CMItemCount *numberFramesOut, MTAudioProcessingTapFlags *flagsOut) {
        OSStatus err = MTAudioProcessingTapGetSourceAudio(tap, numberFrames, bufferListInOut, flagsOut, NULL, numberFramesOut);
        // OSStatus is a 32-bit integer, so cast for the format specifier
        if (err) NSLog(@"Error from GetSourceAudio: %d", (int)err);
    
        SIVSViewController *self = (__bridge SIVSViewController *)MTAudioProcessingTapGetStorage(tap);
    
        // Copy the selected channel's samples over the other channel.
        // (Don't guard this with `if (self.selectedChannel)` -- that test is
        // false when channel 0 is selected, so channel 0 would never be applied.)
        NSInteger channel = self.selectedChannel;
        NSInteger other = (channel == 0) ? 1 : 0;
        if (bufferListInOut->mNumberBuffers == 2) {
            memcpy(bufferListInOut->mBuffers[other].mData,
                   bufferListInOut->mBuffers[channel].mData,
                   bufferListInOut->mBuffers[channel].mDataByteSize);
        }
    }
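
    The heart of the process callback is simply duplicating one channel's sample buffer into the other. Here is a minimal standalone C sketch of that copy, using simplified stand-in structs (hypothetical names, for illustration only) in place of CoreAudio's AudioBuffer/AudioBufferList:

```c
#include <stddef.h>
#include <string.h>

/* Simplified stand-ins for CoreAudio's AudioBuffer / AudioBufferList --
 * hypothetical types, just enough to exercise the channel copy. */
typedef struct {
    size_t  mDataByteSize;
    float  *mData;
} Buffer;

typedef struct {
    unsigned mNumberBuffers;
    Buffer   mBuffers[2];
} BufferList;

/* Overwrite the non-selected channel with the selected one, turning a
 * non-interleaved stereo buffer list into dual mono. */
static void duplicate_channel(BufferList *list, int selected) {
    if (list->mNumberBuffers != 2)
        return; /* nothing to do for mono input */
    int other = (selected == 0) ? 1 : 0;
    memcpy(list->mBuffers[other].mData,
           list->mBuffers[selected].mData,
           list->mBuffers[selected].mDataByteSize);
}
```

    In the real tap the same memcpy runs on the bufferListInOut filled by MTAudioProcessingTapGetSourceAudio; copying the samples (rather than aliasing the mData pointers) avoids having two buffers point at the same memory.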
    