ARKit & Reality Composer - how to anchor a scene using image coordinates

猫巷女王i · 2021-01-24 04:55

I have written code to initialise one of 3 Reality Composer scenes when a button is pressed depending on the day of the month.

That all works fine.

The Reality C

1 Answer

    一向 · 2021-01-24 05:39

    What is happening is that when the image anchor goes out of view, the AnchorEntity becomes unanchored, and RealityKit then stops rendering it and all of its descendants.
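
    If you want to confirm that this is the cause, RealityKit publishes a SceneEvents.AnchoredStateChanged event whenever an entity gains or loses its anchor, so you can log the change when the image leaves the camera view. A minimal sketch (the helper name is mine; keep the returned Cancellable alive for as long as you want the subscription):

    import Combine
    import RealityKit

    var anchorSubscriptions = [Cancellable]()

    func observeAnchoring(on arView: ARView) {
      // Fires whenever an anchoring entity becomes anchored or unanchored,
      // e.g. when the tracked image leaves or re-enters the camera view.
      let sub = arView.scene.subscribe(to: SceneEvents.AnchoredStateChanged.self) { event in
        print("Anchored state changed, isAnchored: \(event.isAnchored)")
      }
      anchorSubscriptions.append(sub)
    }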

    One way to work around this is to separate the image anchor from the content you want to render: add the image anchor manually in code, and when it is first detected, add your content to the scene under a separate world anchor. Whenever the image anchor's transform is updated, update your world anchor to match.

    That way you can use the image anchor, while it is visible, to get the latest transform, but when the image goes out of view the rendering of the content is no longer tied to it. Something like the code below (you will need to create an AR Resource Group called ARTest and add an image named "test" to it for the anchor to work):

    import ARKit
    import SwiftUI
    import RealityKit
    import Combine
    
    struct ContentView : View {
        var body: some View {
            return ARViewContainer().edgesIgnoringSafeArea(.all)
        }
    }
    
    let arDelegate = SessionDelegate()
    
    struct ARViewContainer: UIViewRepresentable {
    
      func makeUIView(context: Context) -> ARView {
    
        let arView = ARView(frame: .zero)
    
        arDelegate.set(arView: arView)
        arView.session.delegate = arDelegate
    
        // Create an image anchor, add it to the scene. We won't add any
        // rendering content to the anchor, it will be used only for detection
        let imageAnchor = AnchorEntity(.image(group: "ARTest", name: "test"))
        arView.scene.anchors.append(imageAnchor)
    
        return arView
      }
    
      func updateUIView(_ uiView: ARView, context: Context) {}
    }
    
    final class SessionDelegate: NSObject, ARSessionDelegate {
      var arView: ARView!
      var rootAnchor: AnchorEntity?
    
      func set(arView: ARView) {
        self.arView = arView
      }
    
      func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
    
        // If we already added the content to render, ignore
        if rootAnchor != nil {
           return
        }
    
        // Make sure we are adding to an image anchor. Assuming only
        // one image anchor in the scene for brevity.
        guard anchors[0] is ARImageAnchor else {
          return
        }
    
        // Create the entity to render; you could load your Reality Composer
        // scene here instead. Start the world anchor at the detected image's
        // transform so the content appears at the center of the matched image.
        rootAnchor = AnchorEntity(world: anchors[0].transform)
        let ball = ModelEntity(
          mesh: MeshResource.generateBox(size: 0.01),
          materials: [SimpleMaterial(color: .red, isMetallic: false)]
        )
        rootAnchor!.addChild(ball)
    
        // Just add another model to show how it remains in the scene even
        // when the tracking image is out of view.
        let ball2 = ModelEntity(
          mesh: MeshResource.generateBox(size: 0.10),
          materials: [SimpleMaterial(color: .orange, isMetallic: false)]
        )
        ball.addChild(ball2)
        ball2.position = [0, 0, 1]
    
        arView.scene.addAnchor(rootAnchor!)
      }
    
      func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        guard let rootAnchor = rootAnchor else {
          return
        }
    
        // Code is assuming you only have one image anchor for brevity
        guard let imageAnchor = anchors[0] as? ARImageAnchor else {
          return
        }
    
        if !imageAnchor.isTracked {
          return
        }
    
        // Update our fixed anchor to image transform
        rootAnchor.transform = Transform(matrix: imageAnchor.transform)
      }
    
    }
    
    #if DEBUG
    struct ContentView_Previews : PreviewProvider {
      static var previews: some View {
        ContentView()
      }
    }
    #endif
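
    To render your actual Reality Composer content instead of the test boxes, you can load your scene inside session(_:didAdd:) and parent it to rootAnchor, as the comment there suggests. A rough sketch; Experience and loadMyScene() are placeholders for whatever Swift API Xcode generates from your own .rcproject file and scene name:

    // Inside session(_:didAdd:), after rootAnchor has been created.
    // Experience.loadMyScene() stands in for the loader Xcode generates
    // from your .rcproject - substitute your own type and scene names.
    if let experienceScene = try? Experience.loadMyScene() {
      rootAnchor!.addChild(experienceScene)
    }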
    
    

    NOTE: The transform of an ARImageAnchor seems to update frequently as you move around, because ARKit keeps refining its estimate of the image plane (the content might look like it is in the right place while, for example, the z value is still inaccurate). Make sure the image's physical dimensions in the AR Resource Group are accurate to get better tracking.
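
    If you would rather configure image detection in code instead of relying on the resource group, you supply that physical width yourself when creating the reference image. A minimal sketch, assuming an image asset named "test" that is printed 10 cm wide (both the name and the width are placeholders):

    import ARKit
    import RealityKit

    func runImageDetection(on arView: ARView) {
      guard let cgImage = UIImage(named: "test")?.cgImage else { return }

      // physicalWidth is in metres; an accurate value helps ARKit estimate
      // the image plane (and therefore the depth of your content).
      let reference = ARReferenceImage(cgImage, orientation: .up, physicalWidth: 0.10)
      reference.name = "test"

      let config = ARWorldTrackingConfiguration()
      config.detectionImages = [reference]
      config.maximumNumberOfTrackedImages = 1

      arView.session.run(config)
    }

    Either way, the key point is that the physical size ARKit is given matches the real printed image.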
