How to Convert CMSampleBuffer/UIImage into ffmpeg's AVPicture?

失恋的感觉 · 2021-01-31 23:25

I'm trying to encode the iPhone camera's frames into an H.264 video using ffmpeg's libav* libraries. I found in this Apple article how to convert a CMSampleBuffer to a UIImage, but …

1 Answer
  • 2021-01-31 23:50

    Answering my own question:

    CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(pixelBuffer, 0);
    
    // Access the raw pixel data
    size_t width = CVPixelBufferGetWidth(pixelBuffer);
    size_t height = CVPixelBufferGetHeight(pixelBuffer);
    unsigned char *rawPixelBase = (unsigned char *)CVPixelBufferGetBaseAddress(pixelBuffer);
    
    // Do something with the raw pixels here
    // ...
    
    // Fill in the AVFrame while the base address is still locked;
    // the pointer is only guaranteed valid between lock and unlock.
    AVFrame *pFrame = avcodec_alloc_frame();
    
    // Note: avpicture_fill assumes tightly packed rows (width * 4 bytes each)
    avpicture_fill((AVPicture *)pFrame, rawPixelBase, PIX_FMT_RGB32, width, height);
    
    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
    

    Now pFrame is filled with the contents of the sample buffer, which uses the pixel format kCVPixelFormatType_32BGRA (PIX_FMT_RGB32 maps to BGRA byte order on little-endian devices such as the iPhone).
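    One caveat not covered by the answer above: avpicture_fill assumes each row is exactly width * 4 bytes, but CVPixelBufferGetBytesPerRow can report per-row padding on some devices. A minimal, plain-C sketch of a stride-aware copy into a tightly packed buffer (the helper name is illustrative, not an API):

    ```c
    #include <stdlib.h>
    #include <string.h>
    
    // Copy a possibly row-padded BGRA buffer (bytesPerRow >= width * 4)
    // into a tightly packed buffer that avpicture_fill can consume directly.
    // Caller owns the returned buffer and must free() it.
    static unsigned char *copy_tightly_packed(const unsigned char *src,
                                              size_t width, size_t height,
                                              size_t bytesPerRow) {
        size_t rowBytes = width * 4;  // 4 bytes per BGRA pixel
        unsigned char *dst = malloc(rowBytes * height);
        if (!dst) return NULL;
        for (size_t y = 0; y < height; y++) {
            // Copy only the pixel data, skipping padding at the end of each source row
            memcpy(dst + y * rowBytes, src + y * bytesPerRow, rowBytes);
        }
        return dst;
    }
    ```

    If CVPixelBufferGetBytesPerRow(pixelBuffer) already equals width * 4, this copy is unnecessary and the base address can be passed to avpicture_fill as-is.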

    This solved my issue. Thanks.
