I'm checking the feasibility of eye tracking with ARKit for a new application. We would like to record the point on screen (along with a timestamp) that the user looks at, using an iOS device with TrueDepth capabilities. I have two questions:
- Is there any guarantee on the rate at which `renderer(_:didUpdate:for:)` is called? Do we know, for example, that it is called at least 30 times per second?
- In all the examples I have seen, ARKit face tracking is used together with SceneKit. Is there an option to use face tracking without SceneKit?