FaceTracking in ARKit – How to display the “lookAtPoint” on the screen


Question


The ARFaceTrackingConfiguration of ARKit places ARFaceAnchor with information about the position and orientation of the face onto the scene. Among others, this anchor has the lookAtPoint property that I'm interested in. I know that this vector is relative to the face. How can I draw a point on the screen for this position, meaning how can I translate this point's coordinates?


Answer 1:


The .lookAtPoint instance property only estimates the gaze direction.

Apple's documentation says that .lookAtPoint is a position in face coordinate space that estimates the direction of the face's gaze. It's a vector of three scalar values, and it's gettable only, not settable:

var lookAtPoint: SIMD3<Float> { get }

In other words, this vector is derived from two other quantities – the .rightEyeTransform and .leftEyeTransform instance properties (which are also gettable only):

var rightEyeTransform: simd_float4x4 { get }
var leftEyeTransform: simd_float4x4 { get }

Here's an imaginary scenario showing how you could use this instance property:

func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {

    if let faceAnchor = anchor as? ARFaceAnchor,
       let faceGeometry = node.geometry as? ARSCNFaceGeometry {

        // Tint the face mask depending on which side the gaze points towards
        if faceAnchor.lookAtPoint.x >= 0 {           // Looking towards +X
            faceGeometry.firstMaterial?.diffuse.contents = UIImage(named: "redTexture.png")
        } else {                                     // Looking towards -X
            faceGeometry.firstMaterial?.diffuse.contents = UIImage(named: "cyanTexture.png")
        }

        faceGeometry.update(from: faceAnchor.geometry)
        facialExpression(anchor: faceAnchor)         // custom helper that fills self.textBoard

        DispatchQueue.main.async {
            self.label.text = self.textBoard
        }
    }
}

[Image: axis directions for ARFaceTrackingConfiguration()]

Answering your question:

You can't manage this point's coordinates directly, because it's a gettable-only property (and it only gives an XYZ direction estimate, not an XYZ translation).

So if you need both – translation and rotation – use the .rightEyeTransform and .leftEyeTransform instance properties instead.
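
For instance, here's a minimal sketch (the helper name worldPositionOfLeftEye is just illustrative, not part of ARKit) of how you could combine the anchor's transform with .leftEyeTransform to get a world-space translation:

// The eye transforms are expressed in face-anchor space, so multiply by
// the anchor's world transform before reading out the translation column.
func worldPositionOfLeftEye(for faceAnchor: ARFaceAnchor) -> simd_float3 {
    let eyeWorldTransform = faceAnchor.transform * faceAnchor.leftEyeTransform
    let t = eyeWorldTransform.columns.3              // fourth column holds the translation
    return simd_float3(t.x, t.y, t.z)
}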

There are two projection methods:

FIRST. In SceneKit/ARKit you can use the following instance method to project a point onto the 2D view (called on a sceneView instance, which conforms to SCNSceneRenderer):

func projectPoint(_ point: SCNVector3) -> SCNVector3

or:

let sceneView = ARSCNView()
let screenPoint = sceneView.projectPoint(myPoint)    // myPoint is an SCNVector3 in world space
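
Putting this together for your case, here's a rough sketch (the helper name screenPosition(of:in:) is just illustrative): convert lookAtPoint from face-anchor space into world space with the anchor's transform, then project the result:

func screenPosition(of faceAnchor: ARFaceAnchor, in sceneView: ARSCNView) -> CGPoint {
    // lookAtPoint is expressed in face coordinate space, so move it into world space first
    let local = faceAnchor.lookAtPoint
    let world = faceAnchor.transform * simd_float4(local.x, local.y, local.z, 1)
    // projectPoint returns the renderer's 2D coordinates in x/y (z is the depth)
    let projected = sceneView.projectPoint(SCNVector3(world.x, world.y, world.z))
    return CGPoint(x: CGFloat(projected.x), y: CGFloat(projected.y))
}

You could then place a small UIView or CALayer at the returned CGPoint to visualize the gaze point.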

SECOND. In ARKit you can use the following ARCamera instance method to project a point onto the 2D view:

func projectPoint(_ point: simd_float3, 
              orientation: UIInterfaceOrientation, 
             viewportSize: CGSize) -> CGPoint

or (note that you don't instantiate ARCamera yourself – you take it from the session's current frame):

if let camera = sceneView.session.currentFrame?.camera {
    let screenPoint = camera.projectPoint(myPoint, orientation: myOrientation, viewportSize: vpSize)
}

This method helps you project a point from the 3D world coordinate system of the scene to the 2D pixel coordinate system of the renderer.
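
The same idea with the ARCamera variant might look like this (again just a sketch – the camera comes from the session's current frame, and .portrait orientation is assumed):

if let frame = sceneView.session.currentFrame,
   let faceAnchor = frame.anchors.compactMap({ $0 as? ARFaceAnchor }).first {

    // Convert the face-local gaze point into world space
    let p = faceAnchor.lookAtPoint
    let world = faceAnchor.transform * simd_float4(p.x, p.y, p.z, 1)

    // Project it into the view's 2D coordinate system
    let screenPoint = frame.camera.projectPoint(simd_float3(world.x, world.y, world.z),
                                       orientation: .portrait,
                                      viewportSize: sceneView.bounds.size)
    print(screenPoint)                               // draw your 2D marker here
}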

There's also the opposite method (for unprojecting a point):

func unprojectPoint(_ point: SCNVector3) -> SCNVector3

...and ARKit's opposite method (for unprojecting a point):

@nonobjc func unprojectPoint(_ point: CGPoint, 
            ontoPlane planeTransform: simd_float4x4, 
                         orientation: UIInterfaceOrientation, 
                        viewportSize: CGSize) -> simd_float3?
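
As a quick illustration of the SceneKit variant (a sketch with made-up values), note that the z component of the input is a normalized depth, where 0 is the near clipping plane and 1 is the far clipping plane:

// Recover a world-space position from a 2D view position plus a normalized depth
let screenPoint = SCNVector3(200.0, 350.0, 0.9)      // hypothetical x/y in view space, z = depth
let worldPoint = sceneView.unprojectPoint(screenPoint)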


Source: https://stackoverflow.com/questions/55066870/facetracking-in-arkit-how-to-display-the-lookatpoint-on-the-screen
