0 votes

I have been working with Unity 2020.3 LTS, the Windows XR Plugin, and the amazing MRTK 2.7.0 to port an existing application to HoloLens 2.

In this application I have a scene with several GameObjects, and I need to detect whether a hand touches a GameObject (either with the index fingertip near interaction or the pinch gesture far interaction). The important part is that this detection needs to happen in a central script in the scene (i.e. the hand should ideally be an object in the code) and not from the view of the touched GameObject itself.

I have successfully implemented the GameObject-side detection using this example with the two code samples on that page, but having each touched GameObject fire events via its own listener does not work well with my use case. I need to detect the touch from the hand's perspective, so to speak.

I have searched the web and the Microsoft MRTK documentation several times for this and unfortunately I could not find anything remotely helpful. For head-gaze the documentation has a super simple code example that works beautifully: Head-gaze in Unity. I need the exact same thing for detecting when a hand touches a GameObject.
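
To make the requirement more concrete, this is roughly the kind of central, per-frame check I have in mind for the near (fingertip) case. It is only a sketch: HandTouchWatcher and touchRadius are made-up names, and it relies solely on MRTK's HandJointUtils.TryGetJointPose and Unity's Physics.OverlapSphere.

    using Microsoft.MixedReality.Toolkit.Input;
    using Microsoft.MixedReality.Toolkit.Utilities;
    using UnityEngine;

    // Made-up sketch of a central script that polls the index fingertip
    // each frame and checks which colliders it currently overlaps.
    public class HandTouchWatcher : MonoBehaviour
    {
        [SerializeField] private float touchRadius = 0.01f; // ~1 cm, arbitrary value

        void Update()
        {
            foreach (Handedness hand in new[] { Handedness.Left, Handedness.Right })
            {
                // TryGetJointPose returns false if the hand is not currently tracked.
                if (HandJointUtils.TryGetJointPose(TrackedHandJoint.IndexTip, hand, out MixedRealityPose tip))
                {
                    foreach (Collider hit in Physics.OverlapSphere(tip.Position, touchRadius))
                    {
                        Debug.Log($"{hand} index tip touches {hit.gameObject.name}");
                    }
                }
            }
        }
    }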

Eventually I will also need the same thing for eye-tracking when looking at a GameObject, but I have not looked into this yet and right now the hand interaction is giving me headaches. I hope someone can help me with this. Thanks in advance :).


2 Answers

0 votes

> but the touched GameObject itself firing events via a listener does not work well with my use case.

Why does the event not work? Could you provide more detail about it?

In addition to NearInteractionTouchable, have you tried the Interactable component? It is usually attached to the touched GameObject and fires its event receivers when it catches input actions. In the event receiver (in the component UI) you can register any function attached to any object as a listener, for example a method on a central script in the scene. That should be an effortless way to meet your request. For more information, please see: Event
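
As a rough sketch of what that could look like when wired up from code instead of the Inspector (the class name, CentralScript and OnObjectSelected are placeholders here):

    using Microsoft.MixedReality.Toolkit.UI;
    using UnityEngine;

    // Rough sketch: attach to the touchable GameObject and forward clicks
    // to a central script. CentralScript / OnObjectSelected are placeholders.
    public class InteractableForwarder : MonoBehaviour
    {
        void Start()
        {
            CentralScript central = GameObject.Find("Scripts").GetComponent<CentralScript>();
            Interactable interactable = gameObject.AddComponent<Interactable>();

            // OnClick fires when the Interactable is selected,
            // e.g. by an air tap / pinch on HoloLens 2.
            interactable.OnClick.AddListener(() => central.OnObjectSelected(gameObject));
        }
    }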

0 votes

After some additional fiddling around I was able to get it to work the way I want/need with the Touch Code Example. The solution was to add a GameObject variable to the central script that is continuously checked for null. The touched GameObject binds itself to that variable for as long as it is touched and sets it back to null once the touch ends. This lets the central script work with the touched GameObject while it is being touched.

    using Microsoft.MixedReality.Toolkit.Input;
    using UnityEngine;

    // Attached to each touchable GameObject (the class name is arbitrary).
    public class HandTouchReporter : MonoBehaviour
    {
        private CentralScript centralScript;
        private PointerHandler pointerHandler;

        void Start()
        {
            centralScript = GameObject.Find("Scripts").GetComponent<CentralScript>();

            // Make the whole collider volume touchable and route pointer events to it.
            NearInteractionTouchableVolume touchable = gameObject.AddComponent<NearInteractionTouchableVolume>();
            touchable.EventsToReceive = TouchableEventType.Pointer;

            pointerHandler = gameObject.AddComponent<PointerHandler>();

            // While this GameObject is being touched, hand it to the central script...
            pointerHandler.OnPointerDown.AddListener((e) =>
            {
                centralScript.handTouchGameObject = gameObject;
            });

            // ...and clear the reference again once the touch ends.
            pointerHandler.OnPointerUp.AddListener((e) =>
            {
                centralScript.handTouchGameObject = null;
            });
        }
    }
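
For completeness, the relevant part of the central script boils down to something like this (the Update body is just a placeholder for whatever the central logic actually does):

    using UnityEngine;

    public class CentralScript : MonoBehaviour
    {
        // Set by the touched GameObject's PointerHandler while it is being touched,
        // reset to null when the touch ends.
        public GameObject handTouchGameObject;

        void Update()
        {
            if (handTouchGameObject != null)
            {
                // React to the currently touched GameObject here.
                Debug.Log($"Hand is touching {handTouchGameObject.name}");
            }
        }
    }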