iOS: captureOutput:didOutputSampleBuffer:fromConnection is NOT called

Submitted by 狂风中的少年 on 2019-11-30 10:24:28

I've done a lot of experimenting with this and I think I probably have the answer. I have similar but different code that was written from the ground up rather than copied from Apple's samples (which are a bit old now).

I think it's the section...

// From Apple's AVCam sample: attaches a movie file output to the session.
AVCaptureMovieFileOutput *movieFileOutput = [[AVCaptureMovieFileOutput alloc] init];
if ([session canAddOutput:movieFileOutput])
{
    [session addOutput:movieFileOutput];
    AVCaptureConnection *connection = [movieFileOutput connectionWithMediaType:AVMediaTypeVideo];
    if ([connection isVideoStabilizationSupported])
        [connection setEnablesVideoStabilizationWhenAvailable:YES];
    [self setMovieFileOutput:movieFileOutput];
}

From my experiments, this is what causes your problem. In my code, when this section is present, captureOutput:didOutputSampleBuffer:fromConnection is not called. I think the video system EITHER gives you a series of sample buffers OR records a compressed, optimised movie file to disk, but not both. (At least on iOS.) I guess this makes sense and is not surprising, but I have not seen it documented anywhere!
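If you already have an AVCaptureMovieFileOutput attached and you want the sample-buffer callback to fire, the straightforward fix is to take that output off the session. A minimal sketch, assuming you kept the reference in a movieFileOutput property (as the AVCam snippet above does) and hold the session in a session property like the example further down:

[self.session beginConfiguration];
if (self.movieFileOutput) {
    // Remove the movie file output so the video data output can deliver frames.
    [self.session removeOutput:self.movieFileOutput];
    self.movieFileOutput = nil;
}
[self.session commitConfiguration];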

Also, at one point I seemed to get errors, and/or the buffer callback stopped firing, when I had the microphone input enabled. These were error -11800 (unknown error), again undocumented, but I cannot always reproduce that.
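If you want to at least see those errors when they happen, you can observe the session's runtime error notification. This is only a diagnostic sketch, not a fix:

// Log capture-session runtime errors (such as the -11800 mentioned above).
[[NSNotificationCenter defaultCenter] addObserverForName:AVCaptureSessionRuntimeErrorNotification
                                                  object:self.session
                                                   queue:[NSOperationQueue mainQueue]
                                              usingBlock:^(NSNotification *note) {
    NSError *error = note.userInfo[AVCaptureSessionErrorKey];
    NSLog(@"Capture session runtime error: %@", error);
}];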

Your code looks good to me, and I could think of ten guess-and-check things for you to try, so I'll take a different approach that will hopefully fix the issue indirectly. Besides the fact that I think AVCam is poorly written, it would be better for you to see an example that focuses only on live video, rather than on recording video and taking still images. I have provided an example that does just that and no more.

-(void)startSession {
    self.session = [AVCaptureSession new];
    self.session.sessionPreset = AVCaptureSessionPresetMedium;

    // Find the back-facing camera.
    AVCaptureDevice *backCamera;
    for (AVCaptureDevice *device in [AVCaptureDevice devices]) {
        if ([device hasMediaType:AVMediaTypeVideo] && device.position == AVCaptureDevicePositionBack) {
            backCamera = device;
            break;
        }
    }

    NSError *error = nil;
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:backCamera error:&error];
    if (!input) {
        // handle error (deviceInputWithDevice:error: returns nil on failure)
        return;
    }
    if ([self.session canAddInput:input]) {
        [self.session addInput:input];
    }

    // Note: only a video data output is added here -- no AVCaptureMovieFileOutput.
    AVCaptureVideoDataOutput *output = [AVCaptureVideoDataOutput new];
    // self.queue should be a serial dispatch queue that you own.
    if (!self.queue) {
        self.queue = dispatch_queue_create("videoQueue", DISPATCH_QUEUE_SERIAL);
    }
    [output setSampleBufferDelegate:self queue:self.queue];
    output.videoSettings = @{(id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA)};
    if ([self.session canAddOutput:output]) {
        [self.session addOutput:output];
    }

    // Start the session off the main thread; startRunning blocks until it is running.
    dispatch_async(self.queue, ^{
        [self.session startRunning];
    });
}
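For completeness, here is the delegate method that should now be called for every frame once the session is running (the class must conform to AVCaptureVideoDataOutputSampleBufferDelegate); the body is just an illustrative placeholder:

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection {
    // Called on self.queue for every captured frame.
    CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    // Process pixelBuffer here.
}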

I had the same issue when I was working on a bridge between React Native and native iOS (Swift/Objective-C).

Then I found two similar questions. @Carl's answer does indeed seem to be correct. Then I found another question with this answer:

I have contacted an engineer at Apple's support and he told me that simultaneous AVCaptureVideoDataOutput + AVCaptureMovieFileOutput use is not supported. I don't know if they will support it in the future, but he used the word "not supported at this time".

I encourage you to file a bug report / feature request on this, as I did (bugreport.apple.com), as they measure how much people want something, and perhaps we can see this in the near future.

Simultaneous AVCaptureVideoDataOutput and AVCaptureMovieFileOutput
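If you do need both live frame processing and a recorded movie, the workaround generally used is to drop AVCaptureMovieFileOutput entirely and write the sample buffers to disk yourself with AVAssetWriter. The snippet below is only a hedged sketch of that idea, not code from either linked answer; outputURL and the output settings are placeholder values you would supply (requires AVFoundation):

// Sketch only: record frames yourself instead of using AVCaptureMovieFileOutput.
// outputURL is a placeholder file URL; startWriting/finishWriting handling is omitted.
NSError *writerError = nil;
AVAssetWriter *writer = [AVAssetWriter assetWriterWithURL:outputURL
                                                 fileType:AVFileTypeQuickTimeMovie
                                                    error:&writerError];
AVAssetWriterInput *writerInput =
    [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                       outputSettings:@{AVVideoCodecKey  : AVVideoCodecTypeH264,
                                                        AVVideoWidthKey  : @1280,
                                                        AVVideoHeightKey : @720}];
writerInput.expectsMediaDataInRealTime = YES;
[writer addInput:writerInput];

// Then, inside captureOutput:didOutputSampleBuffer:fromConnection:, append the
// frames you are already processing:
//     if (writerInput.isReadyForMoreMediaData) {
//         [writerInput appendSampleBuffer:sampleBuffer];
//     }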
