core-video

Apply Core Image Filter to Video on OS X using Swift

早过忘川 submitted on 2019-12-08 05:57:42
Question: I am planning to build an NSOpenGLView for an OS X app using Swift that can be used to apply Core Image filters and effects to a video. So far I have written the code for the video controller to add video playback, but I am not sure how to apply the filter to the video:

```swift
class VideoMediaViewController: NSViewController {
    weak var mainView: DTMainViewController?
    @IBOutlet weak var aVPlayerView: AVPlayerView!

    var url: NSURL? {
        didSet {
            // this is the setter
        }
    }

    var observer: AnyObject?
    var player: AVPlayer?
    var videoOutput: AVPlayerItemVideoOutput?
    var ciContext: CIContext?
    var loadStatus: NSNumber?
    // …
```
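
One common way to finish this (a sketch, not taken from the question) is to pull frames from the AVPlayerItemVideoOutput once per display refresh, wrap each in a CIImage, apply the filter, and render the result with the CIContext. The function name renderCurrentFrame and the choice of CISepiaTone are illustrative assumptions:

```swift
import AVFoundation
import CoreImage

// Pull the current frame from the player, filter it, and render the
// result back into the pixel buffer. The property names mirror the
// controller above; CISepiaTone stands in for any Core Image filter.
func renderCurrentFrame(videoOutput: AVPlayerItemVideoOutput,
                        player: AVPlayer,
                        ciContext: CIContext) {
    let itemTime = player.currentTime()
    guard videoOutput.hasNewPixelBuffer(forItemTime: itemTime),
          let pixelBuffer = videoOutput.copyPixelBuffer(forItemTime: itemTime,
                                                        itemTimeForDisplay: nil),
          let filter = CIFilter(name: "CISepiaTone")
    else { return }

    filter.setValue(CIImage(cvPixelBuffer: pixelBuffer), forKey: kCIInputImageKey)
    filter.setValue(0.8, forKey: kCIInputIntensityKey)
    guard let filtered = filter.outputImage else { return }

    // Draw the filtered image back into the buffer (a real renderer
    // would draw into the view's OpenGL context instead).
    ciContext.render(filtered, to: pixelBuffer)
}
```

On OS X a CVDisplayLink is the usual trigger for calling something like this once per screen refresh; the CIContext can also be created from the view's OpenGL context so the filtered frame never leaves the GPU.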

High-performance copying of RGB pixel data to the screen in iOS

♀尐吖头ヾ submitted on 2019-12-07 23:58:32
Question: Our product contains a kind of software image decoder that essentially produces full-frame pixel data that needs to be rapidly copied to the screen (we're running on iOS). Currently we're using CGBitmapContextCreate and accessing the memory buffer directly; for each frame we call CGBitmapContextCreateImage and then draw that bitmap to the screen. This is far too slow for full-screen refreshes on the iPad's Retina display at a decent framerate (though it was okay for non-Retina devices). We've tried all kinds of OpenGL ES-based approaches, including the use of glTexImage2D and glTexSubImage2D…
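
One widely used fix (an assumption on my part, not from the excerpt) is to skip CGBitmapContextCreateImage entirely and let the decoder write straight into an IOSurface-backed CVPixelBuffer, which the GPU can sample without an extra copy. A minimal sketch; decodeFrame is a hypothetical stand-in for the product's decoder:

```swift
import CoreVideo

// Decode straight into an IOSurface-backed CVPixelBuffer instead of a
// CGBitmapContext: the GPU can sample the buffer without an extra copy.
// `decodeFrame` is a hypothetical stand-in for the software decoder.
func makeFramePixelBuffer(width: Int, height: Int,
                          decodeFrame: (UnsafeMutableRawPointer, Int) -> Void) -> CVPixelBuffer? {
    let attrs = [kCVPixelBufferIOSurfacePropertiesKey: [:] as CFDictionary] as CFDictionary
    var pixelBuffer: CVPixelBuffer?
    guard CVPixelBufferCreate(kCFAllocatorDefault, width, height,
                              kCVPixelFormatType_32BGRA, attrs,
                              &pixelBuffer) == kCVReturnSuccess,
          let buffer = pixelBuffer else { return nil }

    CVPixelBufferLockBaseAddress(buffer, [])
    if let base = CVPixelBufferGetBaseAddress(buffer) {
        // The decoder writes BGRA rows directly into the buffer's memory.
        decodeFrame(base, CVPixelBufferGetBytesPerRow(buffer))
    }
    CVPixelBufferUnlockBaseAddress(buffer, [])
    return buffer
}
```

In production the buffers would come from a CVPixelBufferPool rather than being created per frame; the result can then go through the texture-cache route described in the next question.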

Using OpenGL ES texture caches instead of glReadPixels to get texture data

喜欢而已 submitted on 2019-12-07 05:34:32
Question: In iOS 5, OpenGL ES texture caches were introduced to provide a direct path from camera video data to OpenGL without the need to copy buffers. There was a brief introduction to texture caches in session 414, Advances in OpenGL ES for iOS 5, at WWDC 2011. I found an interesting article that takes this concept further and uses it to circumvent a call to glReadPixels by simply locking the texture and then accessing the buffer directly. glReadPixels is really slow due to the tile-based renderer used in the iPad 2 (even when you use only 1x1 textures). However, the described…
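
The core of the technique looks roughly like this; a sketch assuming an existing EAGLContext and a BGRA pixel buffer:

```swift
import CoreVideo
import OpenGLES

// Wrap a CVPixelBuffer in an OpenGL ES texture through the texture cache;
// texture and buffer share the same memory, so no glReadPixels is needed.
// In real code the cache would be created once and reused per frame.
func makeCachedTexture(context: EAGLContext,
                       pixelBuffer: CVPixelBuffer) -> CVOpenGLESTexture? {
    var cache: CVOpenGLESTextureCache?
    guard CVOpenGLESTextureCacheCreate(kCFAllocatorDefault, nil, context,
                                       nil, &cache) == kCVReturnSuccess,
          let textureCache = cache else { return nil }

    var texture: CVOpenGLESTexture?
    let status = CVOpenGLESTextureCacheCreateTextureFromImage(
        kCFAllocatorDefault, textureCache, pixelBuffer, nil,
        GLenum(GL_TEXTURE_2D), GL_RGBA,
        GLsizei(CVPixelBufferGetWidth(pixelBuffer)),
        GLsizei(CVPixelBufferGetHeight(pixelBuffer)),
        GLenum(GL_BGRA), GLenum(GL_UNSIGNED_BYTE), 0, &texture)
    return status == kCVReturnSuccess ? texture : nil
}
```

Because the texture and the pixel buffer share storage, after rendering you can lock the buffer with CVPixelBufferLockBaseAddress and read the pixels directly instead of calling glReadPixels.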

How do I draw onto a CVPixelBufferRef that is planar/ycbcr/420f/yuv/NV12/not rgb?

空扰寡人 submitted on 2019-12-06 04:15:10
Question: I have received a CMSampleBufferRef from a system API that contains CVPixelBufferRefs that are not RGBA (linear pixels). The buffer contains planar pixels (such as 420f, aka kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange, aka YCbCr, aka YUV). I would like to do some manipulation of this video data before sending it off to VideoToolbox to be encoded to H.264 (drawing some text, overlaying a logo, rotating the image, etc.), but I'd like it to be efficient and real-time. Buuuut planar…
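
One answer that avoids hand-written YUV math (an approach I'm assuming, not taken from the excerpt) is to let Core Image do the planar-to-RGB conversion, since it can both read and render 420f pixel buffers. A short sketch:

```swift
import CoreImage
import CoreVideo

// Draw an overlay (text, logo, ...) onto a 420f/NV12-style buffer without
// manual YUV conversion: Core Image can read and render biplanar
// YpCbCr pixel buffers directly.
func draw(overlay: CIImage, onto pixelBuffer: CVPixelBuffer, using context: CIContext) {
    let base = CIImage(cvPixelBuffer: pixelBuffer)
    let composited = overlay.composited(over: base)
    // Renders back into the same planar buffer; the RGB-to-YCbCr
    // conversion happens inside Core Image.
    context.render(composited, to: pixelBuffer)
}
```

The composited buffer can then go straight to VideoToolbox; a rotation would be a CGAffineTransform applied to the base image before compositing.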

CVPixelBuffer to CIImage always returning nil

那年仲夏 submitted on 2019-12-06 02:17:29
Question: I am trying to convert a pixelBuffer extracted from AVPlayerItemVideoOutput to a CIImage, but the result is always nil. The code:

```objc
if ([videoOutput_ hasNewPixelBufferForItemTime:player_.internalPlayer.currentItem.currentTime]) {
    CVPixelBufferRef pixelBuffer = [videoOutput_ copyPixelBufferForItemTime:player_.internalPlayer.currentItem.currentTime
                                                         itemTimeForDisplay:nil];
    CIImage *image = [CIImage imageWithCVPixelBuffer:pixelBuffer]; // image is always nil
    CIFilter *filter = [FilterCollection filterSepiaForImage:image];
    image = filter.outputImage;
    CIContext *context = [CIContext contextWithOptions:nil];
    // …
```
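
A frequent cause of the nil (an assumption; the excerpt doesn't show how videoOutput_ was created) is a pixel format Core Image can't read, or a NULL buffer back from copyPixelBufferForItemTime:. Requesting BGRA when creating the output usually fixes the former; a Swift sketch:

```swift
import AVFoundation

// Ask the video output for BGRA frames; CIImage creation can fail when
// the buffer arrives in a format Core Image does not support.
let attributes: [String: Any] = [
    kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA
]
let videoOutput = AVPlayerItemVideoOutput(pixelBufferAttributes: attributes)
```

It is also worth checking that the copied pixel buffer is non-NULL before wrapping it in a CIImage, since the hasNewPixelBufferForItemTime: check and the copy happen at slightly different times.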

Reading video frame-by-frame under iOS

只愿长相守 submitted on 2019-12-03 05:21:27
Question: I'm looking for a way to retrieve the individual frames of a video using the iOS API. I tried AVAssetImageGenerator, but it seems to only provide frames to the nearest second, which is a bit too rough for my usage. From what I understand of the documentation, with a pipeline of AVAssetReader, AVAssetReaderOutput, and CMSampleBufferGetImageBuffer I should be able to do something, but I'm stuck with a CVImageBufferRef. With this I'm looking for a way to get a CGImageRef or a UIImage, but haven't found…
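
The usual pipeline (sketched below with assumed names; readFrames is illustrative) is an AVAssetReader with a BGRA track output, then Core Image to turn each CVImageBuffer into a CGImage:

```swift
import AVFoundation
import CoreImage
import UIKit

// Walk a video frame by frame with AVAssetReader, converting each
// CVImageBuffer to a UIImage via Core Image.
func readFrames(from url: URL, handler: (UIImage) -> Void) throws {
    let asset = AVAsset(url: url)
    guard let track = asset.tracks(withMediaType: .video).first else { return }

    let reader = try AVAssetReader(asset: asset)
    let output = AVAssetReaderTrackOutput(track: track, outputSettings: [
        kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA
    ])
    reader.add(output)
    guard reader.startReading() else { return }

    let context = CIContext()
    while let sample = output.copyNextSampleBuffer() {
        guard let imageBuffer = CMSampleBufferGetImageBuffer(sample) else { continue }
        let ciImage = CIImage(cvImageBuffer: imageBuffer)
        if let cgImage = context.createCGImage(ciImage, from: ciImage.extent) {
            handler(UIImage(cgImage: cgImage))
        }
    }
}
```

Note also that AVAssetImageGenerator becomes frame-accurate once requestedTimeToleranceBefore and requestedTimeToleranceAfter are set to .zero, which may be enough if only a few frames are needed.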