How do you play a video with alpha channel using AVFoundation?

Asked 2021-02-09 08:57 by 一个人的身影

I have an AR application which uses SceneKit, and imports a video onto the scene using AVPlayer, thereby adding it as a child node of an SKVideoNode.

1 Answer
  • Answered 2021-02-09 09:37

    I've come up with two ways of making this possible. Both utilize surface shader modifiers. Detailed information on shader modifiers can be found in the Apple Developer Documentation.

    Here's an example project I've created.


    1. Masking

    1. You would need to create another video that represents a transparency mask. In that video, black = fully opaque, white = fully transparent (or any other convention you would like to use to represent transparency; you would just need to tinker with the surface shader accordingly).

    2. Create an SKScene with this video, just like you do in the code you provided, and put it into material.transparent.contents (the same material whose diffuse contents hold the main video).

      let spriteKitOpaqueScene = SKScene(...)
      let spriteKitMaskScene = SKScene(...)
      ... // creating SKVideoNodes and AVPlayers for each video etc. (see the sketch after these steps)
      
      let material = SCNMaterial()
      material.diffuse.contents = spriteKitOpaqueScene
      material.transparent.contents = spriteKitMaskScene
      
      let background = SCNPlane(...)
      background.materials = [material]
      
    3. Add a surface shader modifier to the material. It is going to "convert" the black color from the mask video into alpha (it actually reads the red component, since we only need one color channel).

      let surfaceShader = "_surface.transparent.a = 1 - _surface.transparent.r;"
      material.shaderModifiers = [ .surface: surfaceShader ]
      

    That's it! Now the white color on the masking video is going to be transparent on the plane.
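
    For reference, a minimal sketch of the elided setup might look like the following. The file names opaque.mp4 / mask.mp4 and the makeVideoScene helper are just placeholders I'm using here, not part of the example project:

      import SceneKit
      import SpriteKit
      import AVFoundation
      
      // Hypothetical helper: builds an SKScene that renders a single video.
      func makeVideoScene(player: AVPlayer, size: CGSize) -> SKScene {
          let scene = SKScene(size: size)
          let videoNode = SKVideoNode(avPlayer: player)
          videoNode.position = CGPoint(x: size.width / 2, y: size.height / 2)
          videoNode.size = size
          videoNode.yScale = -1 // flip if the video renders upside down as material contents
          scene.addChild(videoNode)
          return scene
      }
      
      // opaque.mp4 holds the color video, mask.mp4 the black/white mask (placeholder names).
      let opaquePlayer = AVPlayer(url: Bundle.main.url(forResource: "opaque", withExtension: "mp4")!)
      let maskPlayer   = AVPlayer(url: Bundle.main.url(forResource: "mask", withExtension: "mp4")!)
      
      let videoSize = CGSize(width: 1280, height: 720) // use your asset's actual size
      let spriteKitOpaqueScene = makeVideoScene(player: opaquePlayer, size: videoSize)
      let spriteKitMaskScene   = makeVideoScene(player: maskPlayer, size: videoSize)
      
      let material = SCNMaterial()
      material.diffuse.contents = spriteKitOpaqueScene
      material.transparent.contents = spriteKitMaskScene
      material.shaderModifiers = [.surface: "_surface.transparent.a = 1.0 - _surface.transparent.r;"]
      
      let background = SCNPlane(width: 1.6, height: 0.9) // meters; match your video's aspect ratio
      background.materials = [material]
      let backgroundNode = SCNNode(geometry: background)
      // add backgroundNode to your scene's root node and call play() on both players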

    However, you would have to take extra care to synchronize these two videos, since the AVPlayers will probably drift out of sync. Sadly I didn't have time to address that in my example project yet (I will get back to it when I have time). Look into this question for a possible solution.
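
    One possible way to at least start both players at the same moment (just a sketch, not what the example project does) is AVPlayer's setRate(_:time:atHostTime:):

      import AVFoundation
      import CoreMedia
      
      // Start several already-prepared AVPlayers in lockstep on the shared host clock.
      func startSynchronized(_ players: [AVPlayer]) {
          for player in players {
              // Required, otherwise setRate(_:time:atHostTime:) is not honored precisely.
              player.automaticallyWaitsToMinimizeStalling = false
          }
          let group = DispatchGroup()
          for player in players {
              group.enter()
              player.preroll(atRate: 1.0) { _ in group.leave() }
          }
          group.notify(queue: .main) {
              // Schedule playback half a second from "now" on the host clock.
              let startTime = CMClockGetTime(CMClockGetHostTimeClock()) + CMTime(value: 1, timescale: 2)
              for player in players {
                  player.setRate(1.0, time: .invalid, atHostTime: startTime)
              }
          }
      }
      
      // startSynchronized([opaquePlayer, maskPlayer])

    Note that this only aligns the start; it does not correct drift over long videos.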

    Pros:

    • No artifacts (if synchronized)
    • Precise

    Cons:

    • Requires two videos instead of one
    • Requires synchronization of the AVPlayers

    2. Chroma keying

    1. You would need a video that has a vibrant color as the background, representing the parts that should be transparent. Usually green or magenta is used.

    2. Create an SKScene for this video like you normally would and put it into material.diffuse.contents.

    3. Add a chroma key surface shader modifier which will cut out the color of your choice and make those areas transparent. I've borrowed this shader from GPUImage and I don't really know how it actually works, but it seems to be explained in this answer.

      let surfaceShader =
      """
      uniform vec3 c_colorToReplace = vec3(0, 1, 0);
      uniform float c_thresholdSensitivity = 0.05;
      uniform float c_smoothing = 0.0;
      
      #pragma transparent
      #pragma body
      
      vec3 textureColor = _surface.diffuse.rgb;
      
      // Convert the key color to YCrCb and keep only the chroma (Cr, Cb) part.
      float maskY = 0.2989 * c_colorToReplace.r + 0.5866 * c_colorToReplace.g + 0.1145 * c_colorToReplace.b;
      float maskCr = 0.7132 * (c_colorToReplace.r - maskY);
      float maskCb = 0.5647 * (c_colorToReplace.b - maskY);
      
      // Do the same for the current texel of the diffuse (video) texture.
      float Y = 0.2989 * textureColor.r + 0.5866 * textureColor.g + 0.1145 * textureColor.b;
      float Cr = 0.7132 * (textureColor.r - Y);
      float Cb = 0.5647 * (textureColor.b - Y);
      
      // Alpha approaches 0 where the texel's chroma is close to the key color, 1 elsewhere.
      float blendValue = smoothstep(c_thresholdSensitivity, c_thresholdSensitivity + c_smoothing, distance(vec2(Cr, Cb), vec2(maskCr, maskCb)));
      
      float a = blendValue;
      _surface.transparent.a = a;
      """
      
      material.shaderModifiers = [ .surface: surfaceShader ]
      

      To set the uniforms, use the setValue(_:forKey:) method on the material.

      let vector = SCNVector3(x: 0, y: 1, z: 0) // represents the float RGB components of the key color
      material.setValue(vector, forKey: "c_colorToReplace")
      material.setValue(0.3 as Float, forKey: "c_smoothing")
      material.setValue(0.1 as Float, forKey: "c_thresholdSensitivity")
      

      The as Float part is important; otherwise Swift will infer Double and the shader will not be able to read the value.

      But to get a precise mask from this you would have to really tinker with the c_smoothing and c_thresholdSensitivity uniforms. In my example project I ended up with a little green rim around the shape, but maybe I just didn't use the right values.
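
      For completeness, a minimal sketch of wiring this variant up (reusing a helper like the makeVideoScene sketched earlier; greenscreen.mp4 is a placeholder name) might be:

      let player = AVPlayer(url: Bundle.main.url(forResource: "greenscreen", withExtension: "mp4")!)
      let videoScene = makeVideoScene(player: player, size: CGSize(width: 1280, height: 720))
      
      let material = SCNMaterial()
      material.diffuse.contents = videoScene
      material.shaderModifiers = [.surface: surfaceShader] // the chroma key shader above
      material.setValue(SCNVector3(x: 0, y: 1, z: 0), forKey: "c_colorToReplace") // key out green
      material.setValue(0.3 as Float, forKey: "c_smoothing")
      material.setValue(0.1 as Float, forKey: "c_thresholdSensitivity")
      
      let plane = SCNPlane(width: 1.6, height: 0.9)
      plane.materials = [material]
      let planeNode = SCNNode(geometry: plane)
      // add planeNode to your AR scene's root node and call player.play()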

    Pros:

    • Only one video required
    • Simple setup

    Cons:

    • Possible artifacts (green rim around the border)