I like the XR Hands visualizer, but I'm wondering if I can use the hand mesh itself as a trigger when it collides with an object. Is this possible? I tried it without success. I just want logic to fire whenever a hand is "touching" an object at any point. I have a feeling the mesh inside these hands is for visualization only. If so, maybe I could use the bone/joint structure as a collider?
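For context, the joint-based idea I'm picturing looks roughly like this: drive a small trigger collider from one tracked joint's pose each frame. This is only a sketch assuming the XR Hands package (`com.unity.xr.hands`) API (`XRHandSubsystem`, `XRHandJointID`, `TryGetPose`), not tested code:

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.Hands;

// Sketch: follow one tracked hand joint with this GameObject.
// Put a small SphereCollider (Is Trigger) and a kinematic Rigidbody
// on the same object so trigger events fire against other colliders.
public class JointTriggerCollider : MonoBehaviour
{
    [SerializeField] XRHandJointID jointID = XRHandJointID.IndexTip;
    [SerializeField] Handedness handedness = Handedness.Right;

    XRHandSubsystem m_Subsystem;

    void Update()
    {
        // Lazily grab a running XR Hands subsystem.
        if (m_Subsystem == null || !m_Subsystem.running)
        {
            var subsystems = new List<XRHandSubsystem>();
            SubsystemManager.GetSubsystems(subsystems);
            if (subsystems.Count > 0)
                m_Subsystem = subsystems[0];
            return;
        }

        var hand = handedness == Handedness.Right
            ? m_Subsystem.rightHand
            : m_Subsystem.leftHand;
        if (!hand.isTracked)
            return;

        // Joint poses are reported in XR Origin (session) space; if the
        // XR Origin is moved or rotated, transform the pose accordingly.
        if (hand.GetJoint(jointID).TryGetPose(out Pose pose))
            transform.SetPositionAndRotation(pose.position, pose.rotation);
    }
}
```

One such object per fingertip (or just the palm) might be enough for coarse touch detection.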
This is the script I want to attach to some simple cube GameObjects: give them box colliders and rigidbodies, and have them become children of a hand once touched. If that works, I could then add an if-statement to check whether the hand is in a grip pose. I want to avoid the pre-made XR Interaction Toolkit actions because of some custom functions I have in mind.
using UnityEngine;

// Attach to the cube. For OnTriggerEnter to fire, the cube's collider
// must have "Is Trigger" enabled and at least one side of the contact
// needs a Rigidbody (kinematic is fine). The hand collider is tagged "Grab".
public class ChildOnCollision : MonoBehaviour
{
    private void OnTriggerEnter(Collider other)
    {
        // Only react to colliders tagged "Grab" (the hand)
        if (other.CompareTag("Grab"))
        {
            // Parent this cube to the hand so it follows it
            transform.SetParent(other.transform);

            // Optionally, snap the cube to the hand's origin
            transform.localPosition = Vector3.zero;
            transform.localRotation = Quaternion.identity;
        }
    }
}
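For the grip-pose check mentioned above, one approach I'm considering is measuring how curled the fingers are from the joint data: treat the hand as gripping when the fingertips are close to the palm. A rough sketch, again assuming the XR Hands package API; the distance threshold is a guess that would need tuning per device:

```csharp
using UnityEngine;
using UnityEngine.XR.Hands;

// Rough grip-pose heuristic: the hand counts as "gripping" when the
// average fingertip-to-palm distance falls below a threshold.
public static class GripPose
{
    const float k_CurlThreshold = 0.06f; // metres; placeholder value, tune per device

    public static bool IsGripping(XRHand hand)
    {
        if (!hand.isTracked)
            return false;
        if (!hand.GetJoint(XRHandJointID.Palm).TryGetPose(out Pose palm))
            return false;

        // Average distance of the four finger tips from the palm joint.
        var tips = new[]
        {
            XRHandJointID.IndexTip, XRHandJointID.MiddleTip,
            XRHandJointID.RingTip, XRHandJointID.LittleTip
        };
        float total = 0f;
        foreach (var id in tips)
        {
            if (!hand.GetJoint(id).TryGetPose(out Pose tip))
                return false;
            total += Vector3.Distance(tip.position, palm.position);
        }
        return total / tips.Length < k_CurlThreshold;
    }
}
```

The OnTriggerEnter above could then gate the parenting on `GripPose.IsGripping(...)` once it has access to the tracked hand.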
