How to apply a Vignette CIFilter to a live camera feed in iOS?


While trying to apply a simple vignette filter to the raw camera feed of an iPhone 6, with the help of Metal and Core Image, I see a lot of lag between the frames being processed.

1 Answer
  • 2020-11-29 12:30

    Your step 2 is way too slow to support real-time rendering... and it looks like you're missing a couple of steps. For your purpose, you would typically:

    Setup:

    1. create a pool of CVPixelBuffers using CVPixelBufferPoolCreate
    2. create a Metal texture cache using CVMetalTextureCacheCreate (a sketch of both follows this list)
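
    A minimal sketch of this setup, assuming a 1920x1080 BGRA stream; the dimensions, pixel format, and pool size are assumptions, so match them to what your capture session actually delivers:

    ```swift
    import CoreVideo
    import Metal

    // Pool of output pixel buffers. 1920x1080 BGRA is an assumption;
    // use the dimensions/format your AVCaptureSession outputs.
    var pixelBufferPool: CVPixelBufferPool?
    let poolAttributes = [kCVPixelBufferPoolMinimumBufferCountKey as String: 3] as CFDictionary
    let bufferAttributes = [
        kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA,
        kCVPixelBufferWidthKey as String: 1920,
        kCVPixelBufferHeightKey as String: 1080,
        kCVPixelBufferMetalCompatibilityKey as String: true
    ] as CFDictionary
    CVPixelBufferPoolCreate(kCFAllocatorDefault, poolAttributes, bufferAttributes, &pixelBufferPool)

    // Texture cache that lets Metal wrap CVPixelBuffers without copying.
    let device = MTLCreateSystemDefaultDevice()!
    var textureCache: CVMetalTextureCache?
    CVMetalTextureCacheCreate(kCFAllocatorDefault, nil, device, nil, &textureCache)
    ```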

    For each frame:

    1. convert the CMSampleBuffer to a CVPixelBuffer and wrap it in a CIImage
    2. pass that CIImage through your filter pipeline
    3. render the output image into a CVPixelBuffer drawn from the pool created in setup step 1
    4. use CVMetalTextureCacheCreateTextureFromImage to create a Metal texture from your filtered CVPixelBuffer (see the sketch after this list)
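
    A sketch of that per-frame path, assuming the pixelBufferPool and textureCache created above plus a Metal-backed CIContext; the CIVignette parameter values are arbitrary:

    ```swift
    import CoreImage
    import CoreMedia
    import CoreVideo
    import Metal

    func render(sampleBuffer: CMSampleBuffer,
                ciContext: CIContext,
                pixelBufferPool: CVPixelBufferPool,
                textureCache: CVMetalTextureCache) -> MTLTexture? {
        // 1. CMSampleBuffer -> CVPixelBuffer -> CIImage
        guard let inputBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return nil }
        let inputImage = CIImage(cvPixelBuffer: inputBuffer)

        // 2. Run the CIImage through the vignette filter.
        let filter = CIFilter(name: "CIVignette")!
        filter.setValue(inputImage, forKey: kCIInputImageKey)
        filter.setValue(1.0, forKey: kCIInputIntensityKey)  // arbitrary values
        filter.setValue(2.0, forKey: kCIInputRadiusKey)
        guard let outputImage = filter.outputImage else { return nil }

        // 3. Render into a CVPixelBuffer drawn from the pool.
        var outputBuffer: CVPixelBuffer?
        CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault, pixelBufferPool, &outputBuffer)
        guard let destination = outputBuffer else { return nil }
        ciContext.render(outputImage, to: destination)

        // 4. Wrap the filtered CVPixelBuffer in a Metal texture (no copy).
        let width = CVPixelBufferGetWidth(destination)
        let height = CVPixelBufferGetHeight(destination)
        var cvTexture: CVMetalTexture?
        CVMetalTextureCacheCreateTextureFromImage(kCFAllocatorDefault, textureCache,
                                                  destination, nil, .bgra8Unorm,
                                                  width, height, 0, &cvTexture)
        return cvTexture.flatMap { CVMetalTextureGetTexture($0) }
    }
    ```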

    If set up correctly, all these steps will make sure your image data stays on the GPU, as opposed to travelling from GPU to CPU and back to GPU for display. One piece that has to cooperate is the CIContext itself; a sketch follows.
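
    For the data to stay on the GPU, the CIContext should be Metal-backed, created once with the same MTLDevice used for display. A sketch; the options shown are common choices for streaming video, not requirements:

    ```swift
    import CoreImage
    import Metal

    // Metal-backed CIContext so Core Image renders on the GPU rather
    // than falling back to a CPU path. `device` is the MTLDevice from setup.
    let ciContext = CIContext(mtlDevice: device, options: [
        .workingColorSpace: CGColorSpace(name: CGColorSpace.sRGB)!,
        .cacheIntermediates: false  // skip caching for a live video stream
    ])
    ```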

    The good news is that all of this is demonstrated in Apple's AVCamPhotoFilter sample code: https://developer.apple.com/library/archive/samplecode/AVCamPhotoFilter/Introduction/Intro.html#//apple_ref/doc/uid/TP40017556. In particular, see the RosyCIRenderer class and its superclass FilterRenderer.
