Performance issues when using AVCaptureVideoDataOutput and AVCaptureAudioDataOutput


Question


I'm seeing lag when recording audio and video with AVCaptureVideoDataOutput and AVCaptureAudioDataOutput. Sometimes the video stalls for a few milliseconds, and sometimes the audio is out of sync with the video.

I inserted some logs and observed that I first get a lot of video buffers in the captureOutput callback, and only after some time do I get the audio buffers (sometimes I don't receive the audio buffers at all, and the resulting video has no sound). If I comment out the code that handles the video buffers, I get the audio buffers without problems.

This is the code I'm using:

-(void)initMovieOutput:(AVCaptureSession *)captureSessionLocal
{   
    AVCaptureVideoDataOutput *dataOutput = [[AVCaptureVideoDataOutput alloc] init];
    self._videoOutput = dataOutput;
    [dataOutput release];

    self._videoOutput.alwaysDiscardsLateVideoFrames = NO;
    self._videoOutput.videoSettings = [NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange]
                                                                   forKey:(id)kCVPixelBufferPixelFormatTypeKey];
    AVCaptureAudioDataOutput *audioOutput =  [[AVCaptureAudioDataOutput alloc] init];
    self._audioOutput = audioOutput;
    [audioOutput release];

    [captureSessionLocal addOutput:self._videoOutput];
    [captureSessionLocal addOutput:self._audioOutput];


    // Setup the queue
    dispatch_queue_t queue = dispatch_queue_create("MyQueue", NULL);
    [self._videoOutput setSampleBufferDelegate:self queue:queue];
    [self._audioOutput setSampleBufferDelegate:self queue:queue];
    dispatch_release(queue);
}

Here I set up the writer:

-(BOOL) setupWriter:(NSURL *)videoURL session:(AVCaptureSession *)captureSessionLocal
{
    NSError *error = nil;
    self._videoWriter = [[AVAssetWriter alloc] initWithURL:videoURL fileType:AVFileTypeQuickTimeMovie
                                                         error:&error];
    NSParameterAssert(self._videoWriter);


    // Add video input  
    NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                   AVVideoCodecH264, AVVideoCodecKey,
                                   [NSNumber numberWithInt:640], AVVideoWidthKey,
                                   [NSNumber numberWithInt:480], AVVideoHeightKey,
                                   nil];

    self._videoWriterInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                                                         outputSettings:videoSettings];


    NSParameterAssert(self._videoWriterInput);
    self._videoWriterInput.expectsMediaDataInRealTime = YES;
    self._videoWriterInput.transform = [self returnOrientation];

    // Add the audio input
    AudioChannelLayout acl;
    bzero( &acl, sizeof(acl));
    acl.mChannelLayoutTag = kAudioChannelLayoutTag_Mono;


    NSDictionary *audioOutputSettings = nil;
    // Both types of audio input cause the output video file to be corrupted.

    // Should work on any device, but requires more space.
    audioOutputSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                           [NSNumber numberWithInt:kAudioFormatAppleLossless], AVFormatIDKey,
                           [NSNumber numberWithInt:16], AVEncoderBitDepthHintKey,
                           [NSNumber numberWithFloat:44100.0], AVSampleRateKey,
                           [NSNumber numberWithInt:1], AVNumberOfChannelsKey,
                           [NSData dataWithBytes:&acl length:sizeof(acl)], AVChannelLayoutKey,
                           nil];

    self._audioWriterInput = [AVAssetWriterInput 
                                        assetWriterInputWithMediaType: AVMediaTypeAudio 
                                        outputSettings: audioOutputSettings ];

    self._audioWriterInput.expectsMediaDataInRealTime = YES;    

    // add input
    [self._videoWriter addInput:_videoWriterInput];
    [self._videoWriter addInput:_audioWriterInput];

    return YES;
}

And here is the callback:

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{

    if( !CMSampleBufferDataIsReady(sampleBuffer) )
    {
        NSLog( @"sample buffer is not ready. Skipping sample" );
        return;
    }
    if( _videoWriter.status !=  AVAssetWriterStatusCompleted )
    {
        if( _videoWriter.status != AVAssetWriterStatusWriting  )
        {               
            CMTime lastSampleTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer); 
            [_videoWriter startWriting];
            [_videoWriter startSessionAtSourceTime:lastSampleTime];
        }

        if( captureOutput == _videoOutput )
        {
            if( [self._videoWriterInput isReadyForMoreMediaData] )
            {
                [self newVideoSample:sampleBuffer];
            }
        }
        else if( captureOutput == _audioOutput )
        {
            if( [self._audioWriterInput isReadyForMoreMediaData] )
            {
                [self newAudioSample:sampleBuffer];
            }
        }
    }

}

-(void) newAudioSample:(CMSampleBufferRef)sampleBuffer
{
    if( _videoWriter.status > AVAssetWriterStatusWriting )
    {
        [self NSLogPrint:[NSString stringWithFormat:@"Audio:Warning: writer status is %d", _videoWriter.status]];
        if( _videoWriter.status == AVAssetWriterStatusFailed )
            [self NSLogPrint:[NSString stringWithFormat:@"Audio:Error: %@", _videoWriter.error]];
        return;
    }

    if( ![_audioWriterInput appendSampleBuffer:sampleBuffer] )
        [self NSLogPrint:[NSString stringWithFormat:@"Unable to write to audio input"]];
}

-(void) newVideoSample:(CMSampleBufferRef)sampleBuffer
{
    if( _videoWriter.status > AVAssetWriterStatusWriting )
    {
        [self NSLogPrint:[NSString stringWithFormat:@"Video:Warning: writer status is %d", _videoWriter.status]];
        if( _videoWriter.status == AVAssetWriterStatusFailed )
            [self NSLogPrint:[NSString stringWithFormat:@"Video:Error: %@", _videoWriter.error]];
        return;
    }


    if( ![_videoWriterInput appendSampleBuffer:sampleBuffer] )
        [self NSLogPrint:[NSString stringWithFormat:@"Unable to write to video input"]];
}

Is there something wrong in my code? Why does the video lag? (I'm testing on an iPhone 4 with iOS 4.2.1.)


Answer 1:


It looks like you are using a single serial queue for both outputs, so the audio output's callbacks end up queued behind the video output's processing. Consider giving each output its own queue so the two can be handled concurrently.
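For reference, a minimal sketch of that change, reusing the property names from the question (the queue labels are just illustrative). Each output gets its own serial dispatch queue, so audio callbacks no longer wait behind video processing; setSampleBufferDelegate:queue: expects a serial queue, so the concurrency comes from using two separate queues rather than a single concurrent one.

// Sketch only: give each data output its own serial queue so the two
// delegate callbacks can run independently of each other.
dispatch_queue_t videoQueue = dispatch_queue_create("com.example.videoCaptureQueue", NULL);
dispatch_queue_t audioQueue = dispatch_queue_create("com.example.audioCaptureQueue", NULL);

[self._videoOutput setSampleBufferDelegate:self queue:videoQueue];
[self._audioOutput setSampleBufferDelegate:self queue:audioQueue];

// Pre-ARC dispatch objects still need an explicit release,
// as in the original initMovieOutput: code.
dispatch_release(videoQueue);
dispatch_release(audioQueue);

Note that with two queues, newVideoSample: and newAudioSample: can run at the same time, so any state they share (for example the startWriting / startSessionAtSourceTime: check in the captureOutput callback) may need to be synchronized.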



Source: https://stackoverflow.com/questions/11274652/performance-issues-when-using-avcapturevideodataoutput-and-avcaptureaudiodataout
