Making CIContext.render(CIImage, CVPixelBuffer) work with AVAssetWriter

Asked by 南方客 on 2021-01-03 06:45

I want to use Core Image for processing a bunch of CGImage objects and turning them into a QuickTime movie on macOS. The following code demonstrates my approach.

2 Answers
  •  执笔经年 · 2021-01-03 07:36

    I think the main problem with your current approach is that you pause the very thread the video processing happens on whenever the writer isn't ready.

    For your use case it would be better to use the pull-style API of AVAssetWriterInput, i.e. requestMediaDataWhenReady(on:using:), because you don't need to process any media in real time (as you would when capturing from a camera). So rather than pausing the thread when the input isn't ready, let the writer pull the next frame whenever it's ready for one. Remember to also set expectsMediaDataInRealTime to false in this case. A sketch of this follows below.

    (By the way: you can create a CIImage with a solid color directly via CIImage(color:); no need to create a CGImage first.)
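
    Here is a minimal sketch of what the pull-style approach could look like. Since the question's code isn't shown, the output URL, frame size, codec, frame count, and frame rate below are placeholder assumptions, and the solid-color CIImage stands in for the actual Core Image pipeline:

        import AVFoundation
        import CoreImage

        // Placeholder assumptions, not taken from the original question.
        let width = 1920
        let height = 1080
        let frameCount = 100
        let outputURL = URL(fileURLWithPath: "/tmp/output.mov")

        let writer = try! AVAssetWriter(outputURL: outputURL, fileType: .mov)
        let input = AVAssetWriterInput(mediaType: .video, outputSettings: [
            AVVideoCodecKey: AVVideoCodecType.h264,
            AVVideoWidthKey: width,
            AVVideoHeightKey: height
        ])
        // Offline processing, not live capture.
        input.expectsMediaDataInRealTime = false

        // The adaptor manages a CVPixelBuffer pool compatible with the input.
        let adaptor = AVAssetWriterInputPixelBufferAdaptor(
            assetWriterInput: input,
            sourcePixelBufferAttributes: [
                kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA,
                kCVPixelBufferWidthKey as String: width,
                kCVPixelBufferHeightKey as String: height
            ])
        writer.add(input)
        guard writer.startWriting() else { fatalError("\(writer.error!)") }
        writer.startSession(atSourceTime: .zero)

        let ciContext = CIContext()
        let queue = DispatchQueue(label: "video.writing")
        var frameIndex = 0

        // Pull style: the writer invokes this block whenever it can accept
        // more data, so no thread has to sleep while waiting for it.
        input.requestMediaDataWhenReady(on: queue) {
            while input.isReadyForMoreMediaData && frameIndex < frameCount {
                var pixelBuffer: CVPixelBuffer?
                CVPixelBufferPoolCreatePixelBuffer(nil, adaptor.pixelBufferPool!, &pixelBuffer)
                guard let buffer = pixelBuffer else { return }

                // A solid-color CIImage, created directly without a CGImage.
                // (Replace with the actual per-frame Core Image processing.)
                let image = CIImage(color: CIColor(red: 1, green: 0, blue: 0))
                    .cropped(to: CGRect(x: 0, y: 0, width: width, height: height))
                ciContext.render(image, to: buffer)

                let time = CMTime(value: CMTimeValue(frameIndex), timescale: 30)
                if !adaptor.append(buffer, withPresentationTime: time) { return }
                frameIndex += 1
            }
            if frameIndex >= frameCount {
                input.markAsFinished()
                writer.finishWriting { print("Finished writing \(outputURL.path)") }
            }
        }

    Because expectsMediaDataInRealTime is false, the writer throttles the callback itself: the inner loop fills buffers until isReadyForMoreMediaData turns false and then returns, and the block is invoked again once the writer has drained its queue.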
