How to use AVMutableComposition and CALayers on iOS

Submitted by 我与影子孤独终老i on 2019-12-07 17:18:32

Question


I'm planning to render content in a view on iOS using AVMutableComposition. I want to combine the video coming from one of the iPhone cameras with content created in a CALayer; a mutable composition seems to fit the bill here, as it can composite layers into the video content.

It's not critical that the compositing be done as the video is being recorded. I'm also happy to mix the required data into a composition that is then rendered (via AVAssetExportSession) to a file after the initial video recording has been completed.

What I don't get, though, is how a CALayer is supposed to know what to draw at a given time during the composition, in the context of the AVFoundation framework.

My layer content depends on a timeline that describes what needs to be drawn within the layer. So if I embed a layer into the mutable composition and then export that composition via AVAssetExportSession, how will the CALayer instance know what time it's supposed to produce content for?


Answer 1:


I've had a similar thing going on. I would recommend checking out the WWDC 2010 AVEditDemo sample application source. There is example code there that does exactly what you need: placing a CALayer on top of a video track and running an animation on it.
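To sketch the technique that demo uses: the key is AVVideoCompositionCoreAnimationTool, which renders your layer tree against the composition's timeline during export. Animations added with beginTime set to AVCoreAnimationBeginTimeAtZero are interpreted as starting at time zero of the movie rather than at the wall-clock "now", which is how the layer "knows" what time it is. Below is a minimal Swift sketch under those assumptions; the function name, render size, frame rate, and fade animation are all illustrative, not taken from the demo:

```swift
import AVFoundation
import QuartzCore

// Builds an AVMutableVideoComposition that composites an animated CALayer
// over the video track of `composition`. Names and values here are
// illustrative; adapt them to your own timeline.
func makeVideoComposition(for composition: AVMutableComposition,
                          videoSize: CGSize) -> AVMutableVideoComposition {
    // Layer into which the composited video frames are rendered at export time.
    let videoLayer = CALayer()
    videoLayer.frame = CGRect(origin: .zero, size: videoSize)

    // Layer holding your overlay content.
    let overlayLayer = CALayer()
    overlayLayer.frame = CGRect(origin: .zero, size: videoSize)

    let parentLayer = CALayer()
    parentLayer.frame = CGRect(origin: .zero, size: videoSize)
    parentLayer.addSublayer(videoLayer)
    parentLayer.addSublayer(overlayLayer)

    // Example: fade the overlay in over the first two seconds of the movie.
    // AVCoreAnimationBeginTimeAtZero means "time zero of the composition";
    // a plain 0 would be interpreted by Core Animation as "now".
    let fade = CABasicAnimation(keyPath: "opacity")
    fade.fromValue = 0.0
    fade.toValue = 1.0
    fade.duration = 2.0
    fade.beginTime = AVCoreAnimationBeginTimeAtZero
    fade.isRemovedOnCompletion = false
    overlayLayer.add(fade, forKey: "fadeIn")

    let videoComposition = AVMutableVideoComposition()
    videoComposition.renderSize = videoSize
    videoComposition.frameDuration = CMTime(value: 1, timescale: 30)
    videoComposition.animationTool = AVVideoCompositionCoreAnimationTool(
        postProcessingAsVideoLayer: videoLayer, in: parentLayer)

    // The export also needs at least one instruction covering the timeline.
    let videoTrack = composition.tracks(withMediaType: .video)[0]
    let layerInstruction = AVMutableVideoCompositionLayerInstruction(
        assetTrack: videoTrack)
    let instruction = AVMutableVideoCompositionInstruction()
    instruction.timeRange = CMTimeRange(start: .zero, duration: composition.duration)
    instruction.layerInstructions = [layerInstruction]
    videoComposition.instructions = [instruction]

    return videoComposition
}
```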

You can also check out my efforts on the subject here: Mix video with static image in CALayer using AVVideoCompositionCoreAnimationTool
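For the export step the question mentions (the class is actually AVAssetExportSession), a rough sketch might look like the following; the output location and preset are assumptions:

```swift
import AVFoundation

// Exports `composition` to a movie file, applying the Core Animation
// overlay via `videoComposition`. Output path and preset are illustrative.
func export(_ composition: AVMutableComposition,
            applying videoComposition: AVVideoComposition) {
    let outputURL = FileManager.default.temporaryDirectory
        .appendingPathComponent("composited.mov")

    guard let exporter = AVAssetExportSession(
        asset: composition,
        presetName: AVAssetExportPresetHighestQuality) else { return }

    exporter.outputURL = outputURL
    exporter.outputFileType = .mov
    // Attaching the video composition is what makes the exporter run the
    // Core Animation rendering pass frame by frame during the export.
    exporter.videoComposition = videoComposition

    exporter.exportAsynchronously {
        if exporter.status == .completed {
            print("Exported to \(outputURL)")
        } else {
            print("Export failed: \(String(describing: exporter.error))")
        }
    }
}
```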



Source: https://stackoverflow.com/questions/6206839/how-to-use-avmutablecomposition-and-calayers-on-ios
