How to grab YUV formatted video from the camera, display it and process it

感情败类 2021-02-01 10:24

I am writing an iPhone (iOS 4) program that captures live video from the camera and processes it in real time.

I prefer to capture in kCVPixelFormatType_420YpCbCr8BiPlanarF

1 Answer
  •  梦毁少年i
    2021-02-01 10:56

    Answering my own question: this solved the problem I had (which was to grab YUV output, display it, and process it), although it's not exactly the answer to the original question:

    To grab YUV output from the camera:

    AVCaptureVideoDataOutput *videoOut = [[AVCaptureVideoDataOutput alloc] init];
    [videoOut setAlwaysDiscardsLateVideoFrames:YES];
    [videoOut setVideoSettings:[NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange] forKey:(id)kCVPixelBufferPixelFormatTypeKey]];
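    As an aside, these Core Video pixel-format constants are FourCC codes: kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange is '420v', and its full-range sibling kCVPixelFormatType_420YpCbCr8BiPlanarFullRange is '420f'. A minimal, stand-alone C sketch of the packing (the `fourcc` helper is illustrative, not an Apple API):

    ```c
    #include <stdint.h>
    #include <stdio.h>

    /* Packs four ASCII characters into the 32-bit FourCC code that
     * Core Video uses for its pixel-format constants. */
    uint32_t fourcc(char a, char b, char c, char d) {
        return ((uint32_t)a << 24) | ((uint32_t)b << 16) |
               ((uint32_t)c << 8)  |  (uint32_t)d;
    }

    int main(void) {
        /* kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange == '420v' */
        printf("0x%08X\n", fourcc('4', '2', '0', 'v')); /* 0x34323076 */
        /* kCVPixelFormatType_420YpCbCr8BiPlanarFullRange == '420f' */
        printf("0x%08X\n", fourcc('4', '2', '0', 'f')); /* 0x34323066 */
        return 0;
    }
    ```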
    

    To display it as is, use AVCaptureVideoPreviewLayer; it requires hardly any code. (You can see the FindMyiCon sample in the WWDC samples pack, for example.)

    To process the YUV Y channel (bi-planar in this case, so it's all in a single chunk; if the rows are unpadded you can also use a single memcpy instead of looping):

    - (void)processPixelBuffer: (CVImageBufferRef)pixelBuffer {
    
        CVPixelBufferLockBaseAddress( pixelBuffer, 0 );
    
        size_t bufferHeight = CVPixelBufferGetHeight(pixelBuffer);
        size_t bufferWidth = CVPixelBufferGetWidth(pixelBuffer);
    
        // allocate space for the y channel, reallocating as needed.
        if (bufferWidth != y_channel.width || bufferHeight != y_channel.height)
        {
            if (y_channel.data) free(y_channel.data);
            y_channel.width = bufferWidth;
            y_channel.height = bufferHeight;
            y_channel.data = malloc(y_channel.width * y_channel.height);        
        }
    
        // plane 0 of a bi-planar buffer is the y channel; rows may be
        // padded, so copy row by row using the plane's bytes-per-row.
        uint8_t *yc = CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 0);
        size_t yPitch = CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, 0);
        for (size_t row = 0; row < bufferHeight; row++) {
            memcpy(y_channel.data + row * bufferWidth, yc + row * yPitch, bufferWidth);
        }
    
        CVPixelBufferUnlockBaseAddress( pixelBuffer, 0 );
    }
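    One detail worth stressing: CVPixelBufferGetBytesPerRowOfPlane can report a row pitch larger than the visible width (rows are padded on real hardware), so a packed copy should go row by row. A stand-alone C sketch of that packing step, using a fake 4x3 plane padded to 8 bytes per row (the `pack_y_plane` helper and the test data are illustrative, not AVFoundation API):

    ```c
    #include <stdint.h>
    #include <stdio.h>
    #include <string.h>

    /* Copy a possibly row-padded Y plane into a tightly packed buffer.
     * `pitch` is the source bytes-per-row (>= width), as reported by
     * CVPixelBufferGetBytesPerRowOfPlane on a real CVPixelBuffer. */
    void pack_y_plane(uint8_t *dst, const uint8_t *src,
                      size_t width, size_t height, size_t pitch) {
        for (size_t row = 0; row < height; row++)
            memcpy(dst + row * width, src + row * pitch, width);
    }

    int main(void) {
        enum { W = 4, H = 3, PITCH = 8 };           /* rows padded to 8 bytes */

        uint8_t plane[H * PITCH];
        memset(plane, 0xEE, sizeof plane);          /* 0xEE marks padding */
        for (size_t r = 0; r < H; r++)
            for (size_t c = 0; c < W; c++)
                plane[r * PITCH + c] = (uint8_t)(r * 10 + c);

        uint8_t packed[W * H];
        pack_y_plane(packed, plane, W, H, PITCH);

        /* The padding bytes never reach the packed copy. */
        printf("%u %u %u\n", packed[0], packed[5], packed[11]); /* 0 11 23 */
        return 0;
    }
    ```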
