core-video

Reading video frame-by-frame under iOS

纵饮孤独 submitted on 2019-12-02 17:43:59
I'm looking for a way to retrieve the individual frames of a video using the iOS APIs. I tried AVAssetImageGenerator, but it only seems to provide frames to the nearest second, which is too coarse for my usage. From what I understand of the documentation, with a pipeline of AVAssetReader, AVAssetReaderOutput and CMSampleBufferGetImageBuffer I should be able to do it, but I'm stuck with a CVImageBufferRef. From there I'm looking for a way to get a CGImageRef or a UIImage, but haven't found it. Real-time performance is not needed, and the more I can stick to the provided APIs the better. Thanks a lot! Edit:
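The missing step is going from the CVImageBufferRef to something drawable. A minimal sketch, assuming a Core Image round trip is acceptable; the helper name and the throwaway CIContext are illustrative, not from the question (in a real reader loop the context would be created once and reused):

```swift
import UIKit
import CoreImage
import CoreVideo

// Turn the CVImageBuffer obtained from CMSampleBufferGetImageBuffer into a UIImage.
func makeUIImage(from imageBuffer: CVImageBuffer) -> UIImage? {
    let ciImage = CIImage(cvImageBuffer: imageBuffer)
    let context = CIContext()   // create once and reuse in real code
    guard let cgImage = context.createCGImage(ciImage, from: ciImage.extent) else {
        return nil
    }
    return UIImage(cgImage: cgImage)
}
```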

How to get bytes from a CMSampleBufferRef to send over the network

家住魔仙堡 submitted on 2019-12-02 14:10:44
I am capturing video using the AVFoundation framework, with the help of the Apple documentation at http://developer.apple.com/library/ios/#documentation/AudioVideo/Conceptual/AVFoundationPG/Articles/03_MediaCapture.html%23//apple_ref/doc/uid/TP40010188-CH5-SW2 So far I have done the following:
1. Created a videoCaptureDevice.
2. Created an AVCaptureDeviceInput and set videoCaptureDevice on it.
3. Created an AVCaptureVideoDataOutput and implemented its delegate.
4. Created an AVCaptureSession, set the input to the AVCaptureDeviceInput and the output to the AVCaptureVideoDataOutput.
5. In the AVCaptureVideoDataOutput delegate method -(void)captureOutput:
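A minimal sketch of what the delegate could do with each frame, assuming the data output is configured for a packed pixel format such as kCVPixelFormatType_32BGRA; the helper name and the plain byte copy are illustrative, not from the question:

```swift
import CoreMedia
import CoreVideo

// Copy the raw pixel bytes out of a captured frame so they can be sent over the network.
func bytes(from sampleBuffer: CMSampleBuffer) -> Data? {
    guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return nil }

    CVPixelBufferLockBaseAddress(pixelBuffer, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, .readOnly) }

    guard let base = CVPixelBufferGetBaseAddress(pixelBuffer) else { return nil }
    let length = CVPixelBufferGetBytesPerRow(pixelBuffer) * CVPixelBufferGetHeight(pixelBuffer)
    return Data(bytes: base, count: length)   // copies the frame out of CoreVideo's buffer
}
```

For planar formats (e.g. 420f) each plane would need to be copied separately; the receiver also needs the width, height, bytes-per-row and pixel format to reconstruct the frame.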

Create a CMSampleBuffer from a CVPixelBuffer

丶灬走出姿态 submitted on 2019-12-01 06:25:35
I get a CVPixelBuffer from the ARSessionDelegate: func session(_ session: ARSession, didUpdate frame: ARFrame) { frame.capturedImage // CVPixelBufferRef } But another part of my app (that I can't change) uses a CMSampleBuffer. A CMSampleBuffer is a container for a CVPixelBuffer. In order to create a CMSampleBuffer I can use this function: func CMSampleBufferCreateReadyWithImageBuffer(_ allocator: CFAllocator?, _ imageBuffer: CVImageBuffer, _ formatDescription: CMVideoFormatDescription, _ sampleTiming: UnsafePointer<CMSampleTimingInfo>, _ sBufOut: UnsafeMutablePointer<CMSampleBuffer?>) -> OSStatus The
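A minimal sketch of supplying the missing arguments: the format description can be derived from the pixel buffer itself, and the presentation time is a caller-provided assumption (e.g. converted from ARFrame.timestamp). The question quotes the older unlabeled Swift signature; newer SDKs spell these calls with argument labels, as below:

```swift
import CoreMedia
import CoreVideo

// Wrap an ARKit pixel buffer in a ready-to-use CMSampleBuffer.
func makeSampleBuffer(from pixelBuffer: CVPixelBuffer,
                      presentationTime: CMTime) -> CMSampleBuffer? {
    // Describe the pixel buffer's format.
    var formatDescription: CMVideoFormatDescription?
    CMVideoFormatDescriptionCreateForImageBuffer(allocator: kCFAllocatorDefault,
                                                 imageBuffer: pixelBuffer,
                                                 formatDescriptionOut: &formatDescription)
    guard let format = formatDescription else { return nil }

    // Attach timing; only the presentation timestamp is required here.
    var timing = CMSampleTimingInfo(duration: .invalid,
                                    presentationTimeStamp: presentationTime,
                                    decodeTimeStamp: .invalid)

    var sampleBuffer: CMSampleBuffer?
    CMSampleBufferCreateReadyWithImageBuffer(allocator: kCFAllocatorDefault,
                                             imageBuffer: pixelBuffer,
                                             formatDescription: format,
                                             sampleTiming: &timing,
                                             sampleBufferOut: &sampleBuffer)
    return sampleBuffer
}
```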

Holding onto a MTLTexture from a CVImageBuffer causes stuttering

拟墨画扇 submitted on 2019-11-30 08:37:53
I'm creating an MTLTexture from CVImageBuffers (from the camera and video players) using CVMetalTextureCacheCreateTextureFromImage to get a CVMetalTexture, and then CVMetalTextureGetTexture to get the MTLTexture. The problem I'm seeing is that when I later render the texture using Metal, I occasionally see video frames rendered out of order (visually it stutters back and forth in time), presumably because CoreVideo is modifying the underlying CVImageBuffer storage and the MTLTexture is just pointing there. Is there any way to make CoreVideo not touch that buffer and use another one from its pool until I
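One commonly suggested workaround (not necessarily the poster's eventual fix) is to keep the CVMetalTexture alive until the command buffer that samples it has completed, so CoreVideo cannot recycle the backing CVImageBuffer while the GPU is still reading it. A sketch, assuming a CVMetalTextureCache created once elsewhere with CVMetalTextureCacheCreate:

```swift
import Metal
import CoreVideo

// Create an MTLTexture view of the image buffer and retain the CVMetalTexture
// until the GPU has finished with this frame.
func draw(imageBuffer: CVImageBuffer,
          textureCache: CVMetalTextureCache,
          commandBuffer: MTLCommandBuffer) {
    var cvTexture: CVMetalTexture?
    let width = CVPixelBufferGetWidth(imageBuffer)
    let height = CVPixelBufferGetHeight(imageBuffer)
    CVMetalTextureCacheCreateTextureFromImage(kCFAllocatorDefault, textureCache,
                                              imageBuffer, nil, .bgra8Unorm,
                                              width, height, 0, &cvTexture)
    guard let holder = cvTexture,
          let texture = CVMetalTextureGetTexture(holder) else { return }

    // ... encode render commands that sample `texture` ...

    commandBuffer.addCompletedHandler { _ in
        _ = holder   // released only after the GPU is done reading this frame
    }
}
```

The alternative is to blit-copy the texture into one you own, which costs bandwidth but fully decouples rendering from CoreVideo's buffer pool.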

How to directly update pixels - with CGImage and direct CGDataProvider

我与影子孤独终老i submitted on 2019-11-29 22:42:12
Actual question: any of the following answers would solve my problem:
1. Can I force a CGImage to reload its data from a direct data provider (created with CGDataProviderCreateDirect), the way CGContextDrawImage does? Or is there some other way to make assigning to self.layer.contents do it?
2. Is there a CGContext configuration, or a trick, I can use to render 1024x768 images at a consistent 30 fps or better with CGContextDrawImage?
3. Has anyone been able to successfully use CVOpenGLESTextureCacheCreateTextureFromImage for real-time buffer updates with their own texture data?
I think my biggest problem is creating a
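For the second part of question 1, a minimal sketch of the usual workaround: since (per the question's premise) a CGImage does not re-read its provider after mutation, keep a caller-owned BGRA pixel buffer, wrap it in a fresh CGImage after each update, and assign that to layer.contents. The helper name and buffer ownership are illustrative assumptions, not from the question:

```swift
import QuartzCore
import CoreGraphics

// Wrap a caller-owned BGRA pixel buffer in a new CGImage and hand it to the layer.
// A fresh CGImage is made per frame because the old one will not reflect new pixel data.
func pushPixels(_ buffer: UnsafeMutableRawPointer,
                width: Int, height: Int,
                to layer: CALayer) {
    guard let context = CGContext(data: buffer,
                                  width: width, height: height,
                                  bitsPerComponent: 8,
                                  bytesPerRow: width * 4,
                                  space: CGColorSpaceCreateDeviceRGB(),
                                  bitmapInfo: CGImageAlphaInfo.premultipliedFirst.rawValue
                                      | CGBitmapInfo.byteOrder32Little.rawValue),
          let image = context.makeImage() else { return }
    layer.contents = image   // the layer copies/uploads the image for display
}
```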

How do I convert a CGImage to CMSampleBufferRef?

天大地大妈咪最大 submitted on 2019-11-28 19:18:16
I'd like to convert a CGImage to a CMSampleBufferRef and append it to an AVAssetWriterInput using the appendSampleBuffer: method. I've managed to get the CMSampleBufferRef using the following code, but appendSampleBuffer: simply returns NO when I supply the resulting CMSampleBufferRef. What am I doing wrong? - (void) appendCGImage: (CGImageRef) frame { const int width = CGImageGetWidth(frame); const int height = CGImageGetHeight(frame); // Create a dummy pixel buffer to try the encoding //
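A frequent cause of appendSampleBuffer: returning NO is a pixel-buffer format or timing mismatch rather than the CGImage conversion itself. A minimal Swift sketch of the conversion step, assuming a 32BGRA buffer that the writer input accepts; the helper name is illustrative, and the buffer would still need to be wrapped in a timed CMSampleBuffer (as in the earlier CMSampleBufferCreateReadyWithImageBuffer sketch) before appending:

```swift
import CoreVideo
import CoreGraphics

// Render a CGImage into a freshly allocated BGRA CVPixelBuffer.
func makePixelBuffer(from image: CGImage) -> CVPixelBuffer? {
    let width = image.width, height = image.height
    let attrs = [kCVPixelBufferCGImageCompatibilityKey as String: true,
                 kCVPixelBufferCGBitmapContextCompatibilityKey as String: true] as CFDictionary

    var pixelBuffer: CVPixelBuffer?
    CVPixelBufferCreate(kCFAllocatorDefault, width, height,
                        kCVPixelFormatType_32BGRA, attrs, &pixelBuffer)
    guard let buffer = pixelBuffer else { return nil }

    CVPixelBufferLockBaseAddress(buffer, [])
    defer { CVPixelBufferUnlockBaseAddress(buffer, []) }

    // Draw the image into the pixel buffer's memory.
    guard let context = CGContext(data: CVPixelBufferGetBaseAddress(buffer),
                                  width: width, height: height,
                                  bitsPerComponent: 8,
                                  bytesPerRow: CVPixelBufferGetBytesPerRow(buffer),
                                  space: CGColorSpaceCreateDeviceRGB(),
                                  bitmapInfo: CGImageAlphaInfo.premultipliedFirst.rawValue
                                      | CGBitmapInfo.byteOrder32Little.rawValue) else { return nil }
    context.draw(image, in: CGRect(x: 0, y: 0, width: width, height: height))
    return buffer
}
```

When appending, the sample buffer's presentation timestamps must be monotonically increasing and the writer session must already have been started, otherwise the append also reports failure.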