I have a \"software renderer\" that I am porting from PC to the iPhone. what is the fastest way to manually update the screen with a buffer of pixels on the iphone? for in
Perhaps you could abstract the methods used in the software renderer into a GPU shader... you might get better performance. You'd need to send the encoded "video" data to the GPU as a texture.
The fastest way is to use IOFrameBuffer/IOSurface, which are private frameworks.
So OpenGL seems to be the only possible way for App Store apps.
A faster method than both CGDataProvider and glTexSubImage is CVOpenGLESTextureCache, which lets you directly modify an OpenGL texture in graphics memory without re-uploading it each frame.
I used it for a fast animation view you can see here:
https://github.com/justinmeiners/image-sequence-streaming
It is a little tricky to use, and I came across it after asking my own question about this topic: How to directly update pixels - with CGImage and direct CGDataProvider
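For reference, here is a minimal sketch of the texture-cache setup, assuming an existing OpenGL ES 2.0 EAGLContext (the context variable is a placeholder) and a 1024x768 BGRA frame; see the linked project for the complete drawing code:

#import <CoreVideo/CoreVideo.h>
#import <OpenGLES/EAGL.h>
#import <OpenGLES/ES2/gl.h>
#import <OpenGLES/ES2/glext.h>

// Create a texture cache tied to the GL context (done once).
CVOpenGLESTextureCacheRef textureCache;
CVOpenGLESTextureCacheCreate(kCFAllocatorDefault, NULL, context, NULL, &textureCache);

// Create a pixel buffer whose backing memory is shared with the GPU.
NSDictionary *attrs = @{ (id)kCVPixelBufferOpenGLESCompatibilityKey : @YES,
                         (id)kCVPixelBufferIOSurfacePropertiesKey : @{} };
CVPixelBufferRef pixelBuffer;
CVPixelBufferCreate(kCFAllocatorDefault, 1024, 768, kCVPixelFormatType_32BGRA,
                    (__bridge CFDictionaryRef)attrs, &pixelBuffer);

// Wrap the pixel buffer in a GL texture; no glTexImage2D upload required.
CVOpenGLESTextureRef texture;
CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault, textureCache,
    pixelBuffer, NULL, GL_TEXTURE_2D, GL_RGBA, 1024, 768,
    GL_BGRA, GL_UNSIGNED_BYTE, 0, &texture);

// Each frame: lock, write pixels with the CPU, unlock, then draw the texture.
CVPixelBufferLockBaseAddress(pixelBuffer, 0);
uint8_t *pixels = (uint8_t *)CVPixelBufferGetBaseAddress(pixelBuffer);
// ... software renderer writes into `pixels` here ...
CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
glBindTexture(CVOpenGLESTextureGetTarget(texture),
              CVOpenGLESTextureGetName(texture));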
The fastest App Store-approved way to do CPU-only 2D graphics is to create a CGImage backed by a buffer using CGDataProviderCreateDirect and assign that to a CALayer's contents property.
For best results use the kCGImageAlphaPremultipliedFirst | kCGBitmapByteOrder32Little or kCGImageAlphaNone | kCGBitmapByteOrder32Little bitmap types, and double buffer so that the display is never in an inconsistent state.
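A minimal sketch of that setup, assuming a 32-bit BGRA pixel buffer that your renderer owns and keeps alive for the lifetime of the image (the presentBuffer:width:height: method name is hypothetical):

#import <QuartzCore/QuartzCore.h>

// Direct-access callback: hand Core Graphics a pointer to our pixel buffer.
static const void *GetBytePointer(void *info) {
    return info; // `info` is the pixel buffer itself in this sketch
}

static const CGDataProviderDirectCallbacks kCallbacks = {
    .version = 0,
    .getBytePointer = GetBytePointer,
    .releaseBytePointer = NULL,
    .getBytesAtPosition = NULL,
    .releaseInfo = NULL,
};

// Hypothetical helper: wrap `buffer` in a CGImage and hand it to the layer.
- (void)presentBuffer:(void *)buffer width:(size_t)width height:(size_t)height
{
    size_t rowBytes = width * 4;
    CGDataProviderRef provider =
        CGDataProviderCreateDirect(buffer, height * rowBytes, &kCallbacks);
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGImageRef image = CGImageCreate(width, height, 8, 32, rowBytes, colorSpace,
        kCGImageAlphaPremultipliedFirst | kCGBitmapByteOrder32Little,
        provider, NULL, false, kCGRenderingIntentDefault);

    self.layer.contents = (__bridge id)image; // the layer retains the image

    CGImageRelease(image);
    CGColorSpaceRelease(colorSpace);
    CGDataProviderRelease(provider);
}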
edit: this should be faster than drawing to an OpenGL texture in theory, but as always, profile to be sure.
edit2: CADisplayLink is a useful class no matter which compositing method you use.
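For instance, a minimal sketch of driving per-frame updates with it (the -renderFrame selector is a hypothetical method that redraws and presents the buffer):

// Fire once per display refresh.
CADisplayLink *link = [CADisplayLink displayLinkWithTarget:self
                                                  selector:@selector(renderFrame)];
[link addToRunLoop:[NSRunLoop mainRunLoop] forMode:NSRunLoopCommonModes];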
Just to post my comment on @rpetrich's answer in the form of an answer: in my tests, I found OpenGL to be the fastest way. I've implemented a simple object (a UIView subclass) called EEPixelViewer that does this generically enough that it should work for most people, I think.
It uses OpenGL to push pixels in a wide variety of formats (24 bpp RGB, 32-bit RGBA, and several YpCbCr formats) to the screen as efficiently as possible. The solution achieves 60 fps for most pixel formats on almost every iOS device, including older ones. Usage is super simple and requires no OpenGL knowledge:
// Configure the viewer once for the incoming pixel format and size.
pixelViewer.pixelFormat = kCVPixelFormatType_32RGBA;
pixelViewer.sourceImageSize = CGSizeMake(1024, 768);

// Describe the single plane of pixel data.
EEPixelViewerPlane plane;
plane.width = 1024;
plane.height = 768;
plane.data = pixelBuffer;
plane.rowBytes = plane.width * 4;

[pixelViewer displayPixelBufferPlanes:&plane count:1 withCompletion:nil];
Repeat the displayPixelBufferPlanes call for each frame (it uploads the pixel buffer to the GPU using glTexImage2D), and that's pretty much all there is to it. The code is smart in that it tries to use the GPU for any simple processing required, such as permuting the color channels or converting YpCbCr to RGB.
There is also quite a bit of logic for honoring scaling through the UIView's contentMode property, so UIViewContentModeScaleToFit/Fill, etc. all work as expected.
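For example (continuing the setup above), aspect-fit scaling is just:

// Scale the 1024x768 source to fit the view's bounds, preserving aspect ratio.
pixelViewer.contentMode = UIViewContentModeScaleAspectFit;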