LiDAR and RealityKit – Capture a Real World Texture for a Scanned Model

有刺的猬 2020-12-28 16:33

Task

I would like to capture a real-world texture and apply it to a 3D mesh produced with the help of a LiDAR scanner. I suppose that Projection-View-Model matrices should be used for that.

2 Answers
  • 2020-12-28 17:06

    Here is a preliminary solution (it's not a final one). Identifiers such as RGBUniforms, rgbRadius, viewToCamera, viewportSize, kTextureY, kTextureCbCr, textureCache, session, commandQueue and renderDestination are assumed to be defined elsewhere in the renderer:

    import MetalKit
    import ARKit
    
    /*  Color model YCbCr  */
    var capturedTextureChannelY: CVMetalTexture?      /*  Luma               */
    var capturedTextureChannelCbCr: CVMetalTexture?   /*  Chroma difference  */
    
    /*  Uniforms for drawing the captured camera image  */
    lazy var rgbUniforms: RGBUniforms = {
        var uniforms = RGBUniforms()
        uniforms.radius = rgbRadius
        uniforms.viewToCamera.copy(from: viewToCamera)
        uniforms.viewRatio = Float(viewportSize.width / viewportSize.height)
        return uniforms
    }()
    
    /*  Wrap both planes of the captured YCbCr camera image as Metal textures  */
    func updateTextures(frame: ARFrame) {
        let pixelBuffer = frame.capturedImage
        guard CVPixelBufferGetPlaneCount(pixelBuffer) >= 2 else { return }

        capturedTextureChannelY = makeTexture(fromPixelBuffer: pixelBuffer,
                                              pixelFormat: .r8Unorm,
                                              planeIndex: 0)
        capturedTextureChannelCbCr = makeTexture(fromPixelBuffer: pixelBuffer,
                                                 pixelFormat: .rg8Unorm,
                                                 planeIndex: 1)
    }
    
    /*  Create a CVMetalTexture from one plane of a CVPixelBuffer via the texture cache  */
    func makeTexture(fromPixelBuffer pixelBuffer: CVPixelBuffer,
                     pixelFormat: MTLPixelFormat,
                     planeIndex: Int) -> CVMetalTexture? {

        let width = CVPixelBufferGetWidthOfPlane(pixelBuffer, planeIndex)
        let height = CVPixelBufferGetHeightOfPlane(pixelBuffer, planeIndex)

        var texture: CVMetalTexture? = nil
        let status = CVMetalTextureCacheCreateTextureFromImage(nil,
                                                               textureCache,
                                                               pixelBuffer,
                                                               nil,
                                                               pixelFormat,
                                                               width,
                                                               height,
                                                               planeIndex,
                                                               &texture)

        if status != kCVReturnSuccess {
            texture = nil
        }
        return texture
    }
    
    func draw() {
        guard let currentFrame = session.currentFrame,
              let commandBuffer = commandQueue.makeCommandBuffer(),
              let renderDescriptor = renderDestination.currentRenderPassDescriptor,
              let renderEncoder = commandBuffer.makeRenderCommandEncoder(descriptor: renderDescriptor)
        else { return }

        self.updateTextures(frame: currentFrame)

        if rgbUniforms.radius > 0,
           let textureY = capturedTextureChannelY,
           let textureCbCr = capturedTextureChannelCbCr {

            /*  Keep the CVMetalTextures alive until the GPU is done with them  */
            var retainingTextures = [capturedTextureChannelY,
                                     capturedTextureChannelCbCr]

            commandBuffer.addCompletedHandler { _ in
                retainingTextures.removeAll()
            }

            renderEncoder.setFragmentTexture(CVMetalTextureGetTexture(textureY),
                                             index: Int(kTextureY.rawValue))
            renderEncoder.setFragmentTexture(CVMetalTextureGetTexture(textureCbCr),
                                             index: Int(kTextureCbCr.rawValue))
            renderEncoder.drawPrimitives(type: .triangleStrip, vertexStart: 0, vertexCount: 4)
        }

        renderEncoder.endEncoding()
        commandBuffer.commit()
    }
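
    The snippet above also assumes an already created CVMetalTextureCache stored in textureCache. Here is a minimal sketch of that setup; the function name makeTextureCache(device:) is mine, not part of the original code:

    import Metal
    import CoreVideo
    
    var textureCache: CVMetalTextureCache!
    
    /*  Create the texture cache once, bound to the Metal device used for rendering  */
    func makeTextureCache(device: MTLDevice) {
        var cache: CVMetalTextureCache?
        if CVMetalTextureCacheCreate(kCFAllocatorDefault, nil, device, nil, &cache) == kCVReturnSuccess {
            textureCache = cache
        }
    }

    Call makeTextureCache(device:) once during renderer setup, before the first call to updateTextures(frame:).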
    

    P.S.

    I found a post called LiDAR equipped for 3D modelling on the Apple Developer Forum. It says:

    Question:

    Can Camera and LiDAR sensor work together to achieve a 3D model with texture?

    Answer:

    Yes, that is (partially) possible. You can project any geometry of an anchor back into the camera image to reason about the texture. However, this requires multiple viewpoints and some form of higher-level logic to decide which projection to apply to which part of your geometry.

    Frameworks Engineer
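
    As a rough illustration of what the engineer describes – a sketch under assumptions, not Apple's implementation – the following helper projects every vertex of a scanned ARMeshAnchor back into the captured camera image to derive texture coordinates. The function name textureCoordinates(for:camera:imageSize:) is hypothetical; the projection itself relies only on the real ARKit API camera.projectPoint(_:orientation:viewportSize:).

    import ARKit
    
    /*  Hypothetical helper: one UV per mesh vertex, from a single viewpoint  */
    func textureCoordinates(for meshAnchor: ARMeshAnchor,
                            camera: ARCamera,
                            imageSize: CGSize) -> [simd_float2] {
        let vertices = meshAnchor.geometry.vertices
        var uvs: [simd_float2] = []
        uvs.reserveCapacity(vertices.count)
    
        for index in 0 ..< vertices.count {
            /*  Read one packed (x, y, z) vertex from the MTLBuffer backing the mesh  */
            let pointer = vertices.buffer.contents()
                .advanced(by: vertices.offset + vertices.stride * index)
            let vertex = pointer.assumingMemoryBound(to: (Float, Float, Float).self).pointee
    
            /*  Anchor space -> world space  */
            let world = meshAnchor.transform * simd_float4(vertex.0, vertex.1, vertex.2, 1)
    
            /*  World space -> 2D pixel coordinates in the captured image;
                .landscapeRight matches the raw orientation of frame.capturedImage  */
            let projected = camera.projectPoint(simd_float3(world.x, world.y, world.z),
                                                orientation: .landscapeRight,
                                                viewportSize: imageSize)
    
            /*  Pixel coordinates -> normalized texture coordinates  */
            uvs.append(simd_float2(Float(projected.x / imageSize.width),
                                   Float(projected.y / imageSize.height)))
        }
        return uvs
    }

    Vertices that face away from the camera or fall outside the image get meaningless UVs here; that is exactly where the "multiple viewpoints and higher-level logic" mentioned above have to step in.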

  • 2020-12-28 17:13

    How it can be done in Unity

    I'd like to share some interesting info about how Unity's AR Foundation works with a mesh coming from a LiDAR scanner. At the moment – November 01, 2020 – there's an absurd situation: native ARKit developers cannot capture the texture of a scanned object using standard high-level RealityKit tools, yet Unity's AR Foundation users (creating ARKit apps) can do this using the ARMeshManager script. I don't know whether this script was developed by the AR Foundation team or by developers of a small creative startup (and subsequently bought), but the fact remains.

    To use ARKit meshing with AR Foundation, you just need to add the ARMeshManager component to your scene. The component exposes features such as Texture Coordinates, Color and Mesh Density.

    If anyone has more detailed information on how this should be configured or scripted in Unity, please post about it in this thread.
