I want to use Core Image for processing a bunch of `CGImage` objects and turning them into a QuickTime movie on macOS. The following code demonstrates my current approach.

For your use case it would be better to use the pull-style API of `AVAssetWriterInput` (`requestMediaDataWhenReady(on:using:)`), because you don't need to process any media in real time (as you would when capturing from a camera). So rather than pausing the thread when the input isn't ready, just wait for it to pull the next frame. Remember to also set `expectsMediaDataInRealTime` to `false` in this case.
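
Here is a minimal sketch of what that pull-style setup could look like. The function name, the 30 fps timing, the codec settings, and the error handling are assumptions for illustration, not something from your code:

```swift
import AVFoundation
import CoreImage

// Minimal sketch: write a fixed array of CIImages to a QuickTime movie
// using the pull-style API.
func writeMovie(frames: [CIImage], size: CGSize, to outputURL: URL) throws {
    let writer = try AVAssetWriter(outputURL: outputURL, fileType: .mov)

    let input = AVAssetWriterInput(mediaType: .video, outputSettings: [
        AVVideoCodecKey: AVVideoCodecType.h264,
        AVVideoWidthKey: size.width,
        AVVideoHeightKey: size.height,
    ])
    // Offline processing: let the writer pull frames at its own pace.
    input.expectsMediaDataInRealTime = false

    let adaptor = AVAssetWriterInputPixelBufferAdaptor(
        assetWriterInput: input,
        sourcePixelBufferAttributes: [
            kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA,
            kCVPixelBufferWidthKey as String: size.width,
            kCVPixelBufferHeightKey as String: size.height,
        ])

    writer.add(input)
    guard writer.startWriting() else {
        throw writer.error ?? CocoaError(.fileWriteUnknown)
    }
    writer.startSession(atSourceTime: .zero)

    let context = CIContext()
    let queue = DispatchQueue(label: "video-writing")
    var frameIndex = 0

    // The writer invokes this block whenever it is ready for more data,
    // so the processing thread is never paused manually.
    input.requestMediaDataWhenReady(on: queue) {
        while input.isReadyForMoreMediaData {
            guard frameIndex < frames.count else {
                input.markAsFinished()
                writer.finishWriting { /* handle completion/errors here */ }
                return
            }
            guard let pool = adaptor.pixelBufferPool else { return }
            var pixelBuffer: CVPixelBuffer?
            guard CVPixelBufferPoolCreatePixelBuffer(nil, pool, &pixelBuffer) == kCVReturnSuccess,
                  let buffer = pixelBuffer else { return }

            // Render the CIImage into the buffer and append it at 30 fps (assumed).
            context.render(frames[frameIndex], to: buffer)
            let time = CMTime(value: CMTimeValue(frameIndex), timescale: 30)
            if !adaptor.append(buffer, withPresentationTime: time) { return }
            frameIndex += 1
        }
    }
}
```

The block keeps appending frames as long as `isReadyForMoreMediaData` stays `true` and simply returns otherwise; the writer calls it again once it can accept more data, so there's no manual waiting anywhere.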
I think the main problem with your current approach is that you pause the very thread on which the video processing happens whenever the writer isn't ready yet.
(By the way: you can create `CIImage`s with a solid color directly (`CIImage(color:)`); there's no need to create a `CGImage` first.)
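
For example, since `CIImage(color:)` produces an image of infinite extent, you'd typically crop it to your frame size (the dimensions here are arbitrary):

```swift
import CoreImage

// A solid-color frame without going through CGImage first.
let solidRed = CIImage(color: .red)
    .cropped(to: CGRect(x: 0, y: 0, width: 1920, height: 1080))
```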