Best path from AVPlayerItemVideoOutput to openGL Texture

Asked by 醉话见心 on 2021-02-06 05:51 · 3 answers · 1909 views

Been pulling my hair out trying to figure out the current best path from AVFoundation videos to an OpenGL texture. Most of what I find is related to iOS, and I can't seem to make…

3 Answers
  •  既然无缘
    2021-02-06 06:14

    It appears that the corresponding method of fast video texturing on OSX is to use BiPlanar IOSurfaces where surfacePlane[0] is the luma (Y) and surfacePlane[1] is the subsampled chroma (UV). My code runs on Core Profile, so the GLenum constants to CGLTexImageIOSurface2D reflect the same. Pretty sure that only rectangle textures are supported. I use a GLSL shader to combine them and it's working great on Sierra. Briefly summarized:

    // Note: kPixelFormatTypeGL is the answerer's own constant; for a biplanar
    // Y/UV surface it would presumably be something like
    // kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange.
    NSDictionary *pixelBufferAttributesIOSurfaceBiPlanar = [NSDictionary dictionaryWithObjectsAndKeys:
        [NSNumber numberWithUnsignedInt:kPixelFormatTypeGL], ARC_BRIDGE(id)kCVPixelBufferPixelFormatTypeKey,
        [NSNumber numberWithBool:YES], ARC_BRIDGE(id)kCVPixelBufferOpenGLCompatibilityKey,
        [NSNumber numberWithBool:YES], ARC_BRIDGE(id)kCVPixelBufferIOSurfaceOpenGLTextureCompatibilityKey,
        [NSDictionary dictionary], ARC_BRIDGE(id)kCVPixelBufferIOSurfacePropertiesKey,
        nil];
    
    /* The width/height passed to CGLTexImageIOSurface2D must match the
       plane being wrapped: full resolution for plane 0 (luma), half
       resolution for plane 1 (chroma). textureSize here is assumed to
       hold the current plane's dimensions (see IOSurfaceGetWidthOfPlane
       and IOSurfaceGetHeightOfPlane). */
    /* luma texture */
    CGLTexImageIOSurface2D(glContext, GL_TEXTURE_RECTANGLE, GL_R8,
                           (GLsizei)textureSize.width, (GLsizei)textureSize.height,
                           GL_RED, GL_UNSIGNED_BYTE, surfaceRef, 0);
    /* chroma texture */
    CGLTexImageIOSurface2D(glContext, GL_TEXTURE_RECTANGLE, GL_RG8,
                           (GLsizei)textureSize.width, (GLsizei)textureSize.height,
                           GL_RG, GL_UNSIGNED_BYTE, surfaceRef, 1);
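    One point the calls above gloss over: for 4:2:0 content, the chroma plane is half the luma plane's size in each axis, which is why the GL_RG8 texture ends up at half resolution. A minimal sketch of that relationship (plain C for illustration; the helper name is mine, not from this answer, and real code should query the IOSurface itself):

    ```c
    #include <stddef.h>
    #include <stdio.h>

    /* Illustrative sketch: in a 4:2:0 biplanar surface, plane 0 is
       full-resolution luma and plane 1 is chroma subsampled by 2 in both
       axes. Real code should use IOSurfaceGetWidthOfPlane() and
       IOSurfaceGetHeightOfPlane() rather than compute this by hand. */
    typedef struct { size_t width, height; } PlaneSize;

    static PlaneSize plane_size_420(size_t width, size_t height, int plane)
    {
        PlaneSize s = { width, height };   /* plane 0: luma (Y)      */
        if (plane == 1) {                  /* plane 1: chroma (UV)   */
            s.width  = (width  + 1) / 2;   /* round up for odd sizes */
            s.height = (height + 1) / 2;
        }
        return s;
    }

    int main(void)
    {
        PlaneSize chroma = plane_size_420(1920, 1080, 1);
        printf("chroma plane: %zux%zu\n", chroma.width, chroma.height);
        return 0;
    }
    ```

    This half-size relationship is also why the shader below samples the chroma texture at half the luma coordinates.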
    

    GLSL

    uniform sampler2DRect textureSampler0; // luma plane (GL_R8)
    uniform sampler2DRect textureSampler1; // chroma plane (GL_RG8)
    uniform mat3 colorConversionMatrix;
    // ...
    vec3 YCbCr; // the chroma plane stores Cb in .x and Cr in .y
    vec2 lumaCoords = texCoord0;
    // rectangle texture coordinates are in pixels, so the half-size
    // chroma plane is sampled at half the luma coordinates
    vec2 chromaCoords = lumaCoords * vec2(0.5, 0.5);
    vec2 chroma = texture(textureSampler1, chromaCoords).xy;
    float luma = texture(textureSampler0, lumaCoords).x;
    YCbCr.x = luma - (16.0/255.0); // video range
    YCbCr.yz = chroma - vec2(0.5, 0.5);
    vec4 rgbA = vec4(colorConversionMatrix * YCbCr, 1.0);
    

    The color conversion matrix should be chosen based on the YCbCr matrix attachment of the CVPixelBufferRef:

    CFTypeRef colorAttachments = CVBufferGetAttachment(pixelBuffer, kCVImageBufferYCbCrMatrixKey, NULL);
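    The returned attachment is compared against constants such as kCVImageBufferYCbCrMatrix_ITU_R_601_4 and kCVImageBufferYCbCrMatrix_ITU_R_709_2 to pick BT.601 or BT.709 coefficients. As a sketch of the math the shader performs (plain C for illustration; the matrix values are the standard video-range coefficients, as used in Apple's AVBasicVideoOutput sample, not taken verbatim from this answer):

    ```c
    #include <stdio.h>

    /* Column-major 3x3 matrices, as a GLSL mat3 uniform would receive them.
       These are the standard video-range YCbCr->RGB coefficients. */
    static const float kColorConversion601[9] = {
        1.164f,  1.164f, 1.164f,
        0.0f,   -0.392f, 2.017f,
        1.596f, -0.813f, 0.0f,
    };
    static const float kColorConversion709[9] = {
        1.164f,  1.164f, 1.164f,
        0.0f,   -0.213f, 2.112f,
        1.793f, -0.533f, 0.0f,
    };

    /* Hypothetical helper mirroring the shader math: Y/Cb/Cr inputs are
       normalized to [0,1]; the video-range offsets are subtracted before
       the matrix multiply, exactly as the GLSL above does. */
    static void ycbcr_to_rgb(const float m[9], float y, float cb, float cr,
                             float *r, float *g, float *b)
    {
        float yp  = y  - 16.0f / 255.0f;
        float cbp = cb - 0.5f;
        float crp = cr - 0.5f;
        *r = m[0] * yp + m[3] * cbp + m[6] * crp;
        *g = m[1] * yp + m[4] * cbp + m[7] * crp;
        *b = m[2] * yp + m[5] * cbp + m[8] * crp;
    }

    int main(void)
    {
        float r, g, b;
        /* Video white (Y = 235/255, neutral chroma) should map to ~1.0. */
        ycbcr_to_rgb(kColorConversion709, 235.0f / 255.0f, 0.5f, 0.5f, &r, &g, &b);
        printf("BT.709 video white -> %.3f %.3f %.3f\n", r, g, b);
        return 0;
    }
    ```

    The 1.164 luma scale and the 16/255 offset are what the shader's "video range" comment refers to: video-range luma occupies 16-235 rather than the full 0-255.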

    IOSurfaceRef is a CF-derived type, so I use CFRetain/CFRelease to hold onto the surface for as long as I need the texture. If CGLTexImageIOSurface2D does a GPU blit to upload the texture data, you probably only need to retain it around the call to CGLTexImageIOSurface2D itself.

    I started with the Apple iOS sample code AVBasicVideoOutput. I ported it to Metal for both iOS and OSX in two days, then spent a week trying to figure out the OpenGL Core Profile version. If you can, use Metal: it's faster, and the code is almost exactly the same on iOS and OSX. There is also good information in WWDC13 Session 507, "What's New in OpenGL for OS X"; in particular, it covers a GLSL shader compatibility flag that allows EAGL shaders to run mostly unmodified on OSX.
