Question
I'm looking for a way to retrieve the individual frames of a video using the iOS API. I tried using AVAssetImageGenerator, but it seems to only provide frames to the nearest second, which is a bit too rough for my usage.
From what I understand of the documentation, a pipeline of AVAssetReader, AVAssetReaderOutput and CMSampleBufferGetImageBuffer should let me do something, but I'm stuck with a CVImageBufferRef. From there I'm looking for a way to get a CGImageRef or a UIImage but haven't found one.
Real-time is not needed and the more I can stick to provided API the better.
Thanks a lot!
Edit:
Based on this site: http://www.7twenty7.com/blog/2010/11/video-processing-with-av-foundation and this question: how to convert a CVImageBufferRef to UIImage, I'm nearing a solution. The problem: the AVAssetReader stops reading after the first copyNextSampleBuffer without giving me anything (the sampleBuffer is NULL).
The video is readable by MPMoviePlayerController, so I don't understand what's wrong.
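For reference, the pipeline described above can be sketched as follows in modern Swift (the function name, the BGRA pixel-format choice, and the CIContext-based conversion are my assumptions for illustration, not code from the original question):

```swift
import AVFoundation
import CoreImage
import UIKit

// Sketch: read every frame of a video with AVAssetReader and convert
// each CVImageBuffer to a UIImage via Core Image.
func readFrames(from url: URL) throws -> [UIImage] {
    let asset = AVAsset(url: url)
    guard let track = asset.tracks(withMediaType: .video).first else { return [] }

    let reader = try AVAssetReader(asset: asset)
    // Request BGRA output so the buffers are straightforward to render.
    let output = AVAssetReaderTrackOutput(
        track: track,
        outputSettings: [kCVPixelBufferPixelFormatTypeKey as String:
                         kCVPixelFormatType_32BGRA])
    reader.add(output)
    reader.startReading()

    let ciContext = CIContext()
    var frames: [UIImage] = []
    // copyNextSampleBuffer() returns nil once the track is exhausted.
    while let sampleBuffer = output.copyNextSampleBuffer(),
          let imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) {
        let ciImage = CIImage(cvImageBuffer: imageBuffer)
        if let cgImage = ciContext.createCGImage(ciImage, from: ciImage.extent) {
            frames.append(UIImage(cgImage: cgImage))
        }
    }
    return frames
}
```

This loads all frames into memory at once, which is only reasonable for short clips; for longer videos you would process each frame inside the loop instead of accumulating them.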
Answer 1:
The two links above actually answer my question, and the NULL result from copyNextSampleBuffer is an issue with iOS SDK 5.0b3; it works on the device.
Answer 2:
AVAssetImageGenerator has very loose default tolerances for the exact frame time that is grabbed. Two properties determine the tolerance: requestedTimeToleranceBefore and requestedTimeToleranceAfter. These default to kCMTimePositiveInfinity, so if you want exact times, set both to kCMTimeZero to get exact frames.
(It may take longer to grab the exact frames than approximate frames, but you state that realtime is not an issue.)
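A minimal sketch of this tolerance setup in Swift (the function name and the timestamp handling are illustrative assumptions; the tolerance properties themselves are as described above):

```swift
import AVFoundation
import UIKit

// Sketch: grab the frame at an exact timestamp with AVAssetImageGenerator.
func exactFrame(from url: URL, at seconds: Double) throws -> UIImage {
    let asset = AVAsset(url: url)
    let generator = AVAssetImageGenerator(asset: asset)

    // Both tolerances default to kCMTimePositiveInfinity; zero them out
    // to force an exact (but slower) seek to the requested time.
    generator.requestedTimeToleranceBefore = .zero
    generator.requestedTimeToleranceAfter = .zero

    let time = CMTime(seconds: seconds, preferredTimescale: 600)
    let cgImage = try generator.copyCGImage(at: time, actualTime: nil)
    return UIImage(cgImage: cgImage)
}
```

Passing a non-nil pointer for `actualTime` would report which frame time was actually used, which is a handy sanity check when tuning the tolerances.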
Answer 3:
Use AVReaderWriter. Though it's an OS X Apple code sample, AVFoundation is available on both platforms with few changes.
Source: https://stackoverflow.com/questions/6783214/reading-video-frame-by-frame-under-ios