Fastest way to draw a screen buffer on the iPhone

终归单人心 asked 2020-12-12 21:35

I have a \"software renderer\" that I am porting from PC to the iPhone. what is the fastest way to manually update the screen with a buffer of pixels on the iphone? for in

5 Answers
  • 2020-12-12 21:46

    Perhaps you could move the work your software renderer does into a GPU shader; that might give better performance. You'd need to send the encoded "video" data to the shader as a texture.

  • 2020-12-12 21:49

    The fastest way is to use IOFrameBuffer/IOSurface, but those are private frameworks.

    So OpenGL seems to be the only possible way for App Store apps.
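
    For reference, the core of the OpenGL approach is a one-time texture allocation followed by a per-frame upload of the CPU buffer and a textured full-screen quad. A minimal sketch (not from the original answer; `texture`, `width`, `height`, and `pixels` are placeholders, and it assumes a GL ES context is already current):

    ```objc
    // One-time setup: allocate texture storage for the screen buffer.
    GLuint texture;
    glGenTextures(1, &texture);
    glBindTexture(GL_TEXTURE_2D, texture);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, NULL);

    // Per frame: overwrite the existing storage (cheaper than glTexImage2D),
    // then draw a full-screen quad and present the renderbuffer.
    glBindTexture(GL_TEXTURE_2D, texture);
    glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, width, height,
                    GL_RGBA, GL_UNSIGNED_BYTE, pixels);
    ```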

  • 2020-12-12 21:51

    A faster method than both CGDataProvider and glTexSubImage is CVOpenGLESTextureCache, which lets you modify an OpenGL texture directly in graphics memory without re-uploading it every frame.

    I used it for a fast animation view you can see here:

    https://github.com/justinmeiners/image-sequence-streaming

    It is a little tricky to use and I came across it after asking my own question about this topic: How to directly update pixels - with CGImage and direct CGDataProvider
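
    A rough sketch of the setup, not taken from the linked project: variable names are placeholders, error handling is omitted, and note the pixel buffer needs the IOSurface properties key or the cache cannot share its memory with the GPU.

    ```objc
    // One-time setup: a texture cache bound to the GL ES context,
    // plus a pixel buffer the GPU can read directly.
    CVOpenGLESTextureCacheRef cache;
    CVOpenGLESTextureCacheCreate(kCFAllocatorDefault, NULL, eaglContext,
                                 NULL, &cache);

    NSDictionary *attrs = @{ (id)kCVPixelBufferIOSurfacePropertiesKey : @{} };
    CVPixelBufferRef pixelBuffer;
    CVPixelBufferCreate(kCFAllocatorDefault, width, height,
                        kCVPixelFormatType_32BGRA,
                        (__bridge CFDictionaryRef)attrs, &pixelBuffer);

    CVOpenGLESTextureRef texture;
    CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault, cache,
        pixelBuffer, NULL, GL_TEXTURE_2D, GL_RGBA, width, height,
        GL_BGRA, GL_UNSIGNED_BYTE, 0, &texture);

    // Per frame: write pixels straight into the buffer; no re-upload needed.
    CVPixelBufferLockBaseAddress(pixelBuffer, 0);
    memcpy(CVPixelBufferGetBaseAddress(pixelBuffer), frameBytes,
           CVPixelBufferGetBytesPerRow(pixelBuffer) * height);
    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
    ```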

  • 2020-12-12 21:58

    The fastest App Store approved way to do CPU-only 2D graphics is to create a CGImage backed by a buffer using CGDataProviderCreateDirect and assign that to a CALayer's contents property.

    For best results, use the kCGImageAlphaPremultipliedFirst | kCGBitmapByteOrder32Little or kCGImageAlphaNone | kCGBitmapByteOrder32Little bitmap configurations, and double-buffer so that the display is never in an inconsistent state.

    Edit: this should be faster than drawing to an OpenGL texture in theory, but as always, profile to be sure.

    Edit 2: CADisplayLink is a useful class no matter which compositing method you use.
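
    The approach described above might look roughly like this. This is a sketch under stated assumptions, not rpetrich's actual code: `frontBuffer`, `width`, `height`, and `layer` are placeholders, and the release callbacks are left NULL for brevity.

    ```objc
    // Direct-access provider: Core Graphics reads straight from our buffer.
    static const void *GetBytePointer(void *info)
    {
        return info;  // `info` is the raw pixel buffer itself
    }

    static CGDataProviderDirectCallbacks callbacks = {
        0, GetBytePointer, NULL, NULL, NULL
    };

    // Build a CGImage whose backing store is the software renderer's buffer.
    CGDataProviderRef provider = CGDataProviderCreateDirect(
        frontBuffer, width * height * 4, &callbacks);
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGImageRef image = CGImageCreate(width, height, 8, 32, width * 4,
        colorSpace,
        kCGImageAlphaPremultipliedFirst | kCGBitmapByteOrder32Little,
        provider, NULL, NO, kCGRenderingIntentDefault);

    // Hand the image to the layer; with double buffering you would swap
    // buffers and reassign the image from a CADisplayLink callback.
    layer.contents = (__bridge id)image;
    ```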

  • 2020-12-12 22:07

    To turn my comment on @rpetrich's answer into an answer: in my tests I found OpenGL to be the fastest way. I've implemented a simple UIView subclass called EEPixelViewer that does this generically enough that it should work for most people.

    It uses OpenGL to push pixels in a wide variety of formats (24bpp RGB, 32-bit RGBA, and several YpCbCr formats) to the screen as efficiently as possible. The solution achieves 60fps for most pixel formats on almost every single iOS device, including older ones. Usage is super simple and requires no OpenGL knowledge:

    pixelViewer.pixelFormat = kCVPixelFormatType_32RGBA;
    pixelViewer.sourceImageSize = CGSizeMake(1024, 768);

    EEPixelViewerPlane plane;
    plane.width = 1024;
    plane.height = 768;
    plane.data = pixelBuffer;
    plane.rowBytes = plane.width * 4;   // 4 bytes per pixel for 32-bit RGBA

    [pixelViewer displayPixelBufferPlanes:&plane count:1 withCompletion:nil];
    

    Repeat the displayPixelBufferPlanes call for each frame (which loads the pixel buffer to the GPU using glTexImage2D), and that's pretty much all there is to it. The code is smart in that it tries to use the GPU for any kind of simple processing required such as permuting the color channels, converting YpCbCr to RGB, etc.

    There is also quite a bit of logic for honoring scaling using the UIView's contentMode property, so UIViewContentModeScaleToFit/Fill, etc. all work as expected.
