I've got a question about ARKit and ARCore.
I'm developing an app with Unreal Engine using its built-in augmented reality support. It uses ARCore on Android phones and ARKit on iOS devices.
Here is what I have observed: I have a virtual world, and I can move through it pretty well. But when tracking is lost (for example, when I look at a white wall or put my hand in front of the camera), the Android devices lose both tracking and orientation, which means the whole world is stuck in place. If I do the same with an iPhone, I only lose positional tracking — the orientation still works.
I found the following comparison of ARKit and ARCore.
For ARKit:
Motion Tracking: ARKit can consistently and accurately track the device's position relative to the real objects in the live frame captured by the camera, using Visual-Inertial Odometry (VIO). This allows the device to capture motion sensor data, recording the real-time position of the device.
For ARCore:
Motion Tracking: ARCore tracks and interprets IMU (Inertial Measurement Unit) data, unlike ARKit, which uses VIO. Quite differently, it also measures the shape and features of the surrounding objects to detect and identify the correct position and orientation of the Android device in use.
Source: https://www.itfirms.co/arkit-vs-arcore-how-they-compare-against-each-other/
The description of ARKit's motion tracking says nothing about orientation.
Can someone explain to me, in other, perhaps easier-to-understand words, why ARKit doesn't lose the orientation ability in this case?
Thanks in advance