I've been trying to figure this out for a few days now.
Given an ARKit-based app where I track a user's face, how can I get the face's rotation in absolute terms, from its anchor?
I can get the transform of the ARAnchor, which is a simd_float4x4. There's plenty of info on how to get the position out of that matrix (it's in the last column, columns.3), but nothing on the rotation!
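To make that concrete, this is roughly how I'm reading the matrix (assuming faceAnchor is the ARFaceAnchor from the current frame; the variable names are just mine):
let transform = faceAnchor.transform                   // simd_float4x4
let position = simd_make_float3(transform.columns.3)   // translation lives in the last column
let orientation = simd_quatf(transform)                // rotation part of the matrix, as a quaternion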
I want to be able to control a 3D object outside of the app, by passing YAW, PITCH and ROLL.
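In other words, the output I'm after looks something like this (sendYawPitchRoll is just a placeholder for however I end up sending the data, and the axis convention is my assumption, not something from the docs):
let node = SCNNode()
node.simdTransform = faceAnchor.transform
// SceneKit's eulerAngles are pitch (x), yaw (y) and roll (z), in radians
sendYawPitchRoll(yaw: node.eulerAngles.y, pitch: node.eulerAngles.x, roll: node.eulerAngles.z)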
The latest thing I tried actually works somewhat:
// Grab the current frame and its face anchor (avoiding the force unwrap and the raw [0] index)
guard let arFrame = session.currentFrame,
      let faceAnchor = arFrame.anchors.first as? ARFaceAnchor else { return }
// Bridge the anchor's simd_float4x4 transform into SceneKit's matrix type
let faceMatrix = SCNMatrix4(faceAnchor.transform)
// Put it on a throwaway node so SceneKit decomposes the matrix for me
let node = SCNNode()
node.transform = faceMatrix
// worldOrientation is an SCNQuaternion (an SCNVector4), not a set of Euler angles
let rotation = node.worldOrientation
rotation.x, .y, and .z have values I could use, but as I move my phone the values change. For instance, if I turn 180˚ while still looking at the phone, the values change wildly based on the phone's position.
I tried changing the world alignment in the ARConfiguration, but that didn't make a difference.
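For reference, the alignment change looked roughly like this (.camera is just one example of the values I experimented with):
let configuration = ARFaceTrackingConfiguration()
configuration.worldAlignment = .camera   // default is .gravity
session.run(configuration, options: [.resetTracking, .removeExistingAnchors])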
Am I reading the wrong parameters? This should have been a lot easier!