
Checking the feasibility of eye tracking with ARKit for a new application. We would like to record the point on screen (along with a timestamp) that the user looks at, using an iOS device with TrueDepth capabilities. I have two questions:

  1. Is there any guarantee on the rate at which renderer:didUpdate is called? Do we know, for example, that it is called at least 30 times per second?
  2. In all the examples I have seen, ARKit face tracking requires SceneKit. Is there an option to use face tracking without SceneKit?

1 Answer


First.

It's a pity, but there's no guarantee that your app can render a scene at 60 fps, and there's no guarantee it can hold 30 fps either. The renderer(_:didUpdate:for:) callback is driven by the render loop, so it fires at most once per rendered frame and its rate follows the rendering frame rate. You can set a target rendering frame rate using the preferredFramesPerSecond instance property...

var preferredFramesPerSecond: Int { get set }

or:

@IBOutlet var sceneView: ARSCNView!
sceneView.preferredFramesPerSecond = 30

...but the rate you actually achieve depends on a bunch of factors (especially on how many high-poly models, PBR shaders, and shadows your scene has). Hence, you need to choose a frame rate that your app can consistently maintain.

The default value of preferredFramesPerSecond is 0. When this value is 0, the preferred frame rate is equal to the maximum refresh rate of the display, as indicated by the maximumFramesPerSecond property.
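
If you need to verify that your app actually sustains the rate you requested, you can time the per-frame delegate callback yourself. Here is a minimal sketch, assuming an ARSCNView whose delegate is this object; the FrameRateMonitor name and the 5 ms tolerance are my own choices, not anything from Apple's API:

import ARKit

// Sketch: measure the real interval between per-frame delegate callbacks
// to check whether the app sustains the preferred frame rate.
class FrameRateMonitor: NSObject, ARSCNViewDelegate {
    private var lastTimestamp: TimeInterval = 0

    // SceneKit's render loop calls this once per rendered frame.
    func renderer(_ renderer: SCNSceneRenderer, updateAtTime time: TimeInterval) {
        if lastTimestamp > 0 {
            let delta = time - lastTimestamp
            // Flag frames that took noticeably longer than 1/30 s.
            if delta > (1.0 / 30.0) + 0.005 {
                print("Slow frame: \(Int(delta * 1000)) ms")
            }
        }
        lastTimestamp = time
    }
}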

Second.

Apart from the SceneKit framework, for face tracking you can also use a brand-new framework named RealityKit. But frankly speaking, I haven't tried eye tracking, or so-called gaze detection, in the context of RealityKit yet. Strictly speaking, face tracking doesn't require a rendering framework at all: you can run an ARSession directly and read ARFaceAnchor data from its delegate, as sketched below.
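
A minimal sketch of that session-only approach, assuming you only need the raw anchor data; the FaceTracker class name and the print statement are illustrative, and mapping the gaze estimate to an on-screen point is left to the app:

import ARKit

// Sketch: face tracking without SceneKit or RealityKit — run an ARSession
// directly and consume ARFaceAnchor updates from the session delegate.
class FaceTracker: NSObject, ARSessionDelegate {
    private let session = ARSession()

    func start() {
        // Face tracking requires a TrueDepth camera.
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    // Called each time tracked anchors update, roughly at the capture rate.
    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let faceAnchor as ARFaceAnchor in anchors {
            // lookAtPoint is a gaze estimate in the face anchor's own
            // coordinate space; converting it to screen coordinates is up to you.
            print(faceAnchor.lookAtPoint)
        }
    }
}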