Possibly related to "How can I simulate hand rays on HoloLens 1?"
I want to use HoloLens 1 devices to simulate basic near interactions as provided by HoloLens 2.
Specifically, how can I perform the following mappings:
- Use the hand position during the "Ready" gesture to drive the PokePointer?
- Use the hand position during the "Tap-and-hold" gesture to drive the GrabPointer? (A rough sketch of the kind of mapping I mean follows below.)
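To make that concrete, this is roughly what I have in mind: a proxy transform that follows the GGV hand position every frame, which a poke/grab pointer could then be pinned to. The class name is mine, and the `DeviceInputType.SpatialGrip` check is an assumption on my part about how the GGV hand reports its position on HL1; the rest is just standard MRTK controller enumeration.

```csharp
using Microsoft.MixedReality.Toolkit;
using Microsoft.MixedReality.Toolkit.Input;
using Microsoft.MixedReality.Toolkit.Utilities;
using UnityEngine;

// Sketch only: keep this transform at the GGV hand position so it can serve
// as the poke/grab point. Assumes the GGV hand exposes its position through
// a SpatialGrip interaction mapping, which I haven't verified on HL1.
public class GgvHandFollower : MonoBehaviour
{
    [SerializeField] private Handedness handedness = Handedness.Right;

    private void Update()
    {
        if (CoreServices.InputSystem == null) { return; }

        foreach (var controller in CoreServices.InputSystem.DetectedControllers)
        {
            if (controller.ControllerHandedness != handedness) { continue; }
            if (controller.Interactions == null) { continue; }

            foreach (var interaction in controller.Interactions)
            {
                if (interaction.InputType != DeviceInputType.SpatialGrip) { continue; }

                // The grip may be reported as a full pose or as position only.
                if (interaction.AxisType == AxisType.SixDof)
                {
                    transform.position = interaction.PoseData.Position;
                }
                else if (interaction.AxisType == AxisType.ThreeDofPosition)
                {
                    transform.position = interaction.PositionData;
                }
            }
        }
    }
}
```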
Since HL1 does not track hand orientation, I expect the poses will need to be estimated, similar to the hand-ray example above.
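For clarity, this is the kind of estimate I mean (not an MRTK API, just the same shoulder-to-hand idea as in the hand-ray answer): synthesize a rotation by aiming from an approximate shoulder position toward the tracked hand position. The shoulder offsets are guesses on my part.

```csharp
using Microsoft.MixedReality.Toolkit.Utilities;
using UnityEngine;

// Sketch only: build a full pose from the position-only GGV hand data by
// aiming the rotation from an approximate shoulder position toward the hand,
// the same trick as the hand-ray example. The shoulder offsets are a guess.
public static class GgvPoseEstimator
{
    public static MixedRealityPose EstimatePose(Vector3 handPosition, Transform head, Handedness handedness)
    {
        float side = handedness == Handedness.Left ? -1f : 1f;

        // Rough shoulder position relative to the head; tune the offsets as needed.
        Vector3 shoulder = head.position + head.right * (0.15f * side) - head.up * 0.15f;

        Quaternion rotation = Quaternion.LookRotation(handPosition - shoulder, Vector3.up);
        return new MixedRealityPose(handPosition, rotation);
    }
}
```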
I have tried creating a custom pointer per the answer above, and it works for hand rays but not for poke/grab as far as I can tell.
I've also created a custom poke pointer following the example for WMR controllers in How to mimic HoloLens 2 hand tracking with Windows Mixed Reality controllers [MRTK2]?, and assigned it to the GGV controller in the same fashion. However, the hands don't seem to be detected for poke (or grab), only for hand rays.
(I'm using the Grab pose because HL1 does not seem to return an index-finger pose during the Ready gesture, and the pointer pose appears to refer to the gaze pointer on HL1.)
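A quick way to check whether MRTK is instantiating the custom poke pointer for the GGV hands at all would be something like the snippet below, which logs every detected controller and the pointers attached to it (the class name is just for illustration, and logging every frame is only meant as a quick check):

```csharp
using System.Linq;
using Microsoft.MixedReality.Toolkit;
using UnityEngine;

// Debugging aid: log every detected controller and the pointers MRTK attached
// to it, to see whether the custom poke pointer shows up on the GGV hands.
public class PointerDump : MonoBehaviour
{
    private void Update()
    {
        if (CoreServices.InputSystem == null) { return; }

        foreach (var controller in CoreServices.InputSystem.DetectedControllers)
        {
            var pointers = controller.InputSource?.Pointers;
            string names = pointers == null
                ? "(no pointers)"
                : string.Join(", ", pointers.Select(p => p.GetType().Name));

            Debug.Log($"{controller.GetType().Name} [{controller.ControllerHandedness}]: {names}");
        }
    }
}
```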