A video tutorial on gesture recognition is available on YouTube.
My own summary of the steps:
1. Import the XR Hands and XR Interaction Toolkit packages (I develop for Pico, so I imported the Pico integration first).
2. Build and deploy the HandGestures sample scene to the headset.
3. Create a new hand shape asset.
4. In the headset, make the gesture you want to recognize. The plugin currently supports only static gestures, so hold the pose still and note the values shown for each of the five fingers on the recording UI panel.
5. Enter the recorded values for the five fingers; the more detail, the more accurate the detection. For example, in the figure below I added two Targets for the little finger. Move and rotate the gesture slightly to settle on a reasonable tolerance range for each value.
6. Attach the StaticHandGesture script to an empty GameObject; its fields are explained in the figure below.
(1) The plugin has no gesture-hold event; add one yourself (the code is at the end).
(2) To use the hand models provided by Pico:
- Place the hand prefab under this node.
- Rename the hand's joint nodes to match the plugin's joint names.
- Drag the R_Wrist node onto RootTransform and click FindJoints.
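Once StaticHandGesture is configured, its UnityEvents can be wired in the Inspector to a plain listener script. A minimal sketch; the class and method names here are my own, not part of the plugin:

```csharp
using UnityEngine;

// Hypothetical listener: drag these methods onto the StaticHandGesture
// component's UnityEvent slots in the Inspector.
public class GestureLogger : MonoBehaviour
{
    public void OnGesturePerformed() => Debug.Log("Gesture performed");

    public void OnGestureEnded() => Debug.Log("Gesture ended");

    // For the gesture-hold event the tutorial adds; called every frame
    // while the gesture stays held.
    public void OnGestureHeld() => Debug.Log("Gesture held");
}
```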
Adding a gesture-hold trigger event
using UnityEngine.Events;
using UnityEngine.UI;
using UnityEngine.XR.Hands.Gestures;
namespace UnityEngine.XR.Hands.Samples.GestureSample
{
    /// <summary>
    /// A gesture that detects when a hand is held in a static shape and orientation for a minimum amount of time.
    /// </summary>
    public class StaticHandGesture : MonoBehaviour
    {
        [SerializeField]
        [Tooltip("The hand tracking events component to subscribe to receive updated joint data to be used for gesture detection.")]
        XRHandTrackingEvents m_HandTrackingEvents;

        [SerializeField]
        [Tooltip("The hand shape or pose that must be detected for the gesture to be performed.")]
        ScriptableObject m_HandShapeOrPose;

        [SerializeField]
        [Tooltip("The target Transform to use for target conditions in the hand shape or pose.")]
        Transform m_TargetTransform;

        //[SerializeField]
        //[Tooltip("The image component that draws the background for gesture icons.")]
        //Image m_Background;

        [SerializeField]
        [Tooltip("The event fired when the gesture is performed.")]
        UnityEvent m_GesturePerformed;

        // Added: fired every frame while the gesture stays held (not in the stock sample).
        [SerializeField]
        [Tooltip("The event fired while the gesture is held.")]
        UnityEvent m_GestureHeld;

        // ... (remaining fields and detection logic as in the XR Hands sample)
    }
}
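A minimal sketch of how the hold event could be driven from the joints-updated callback. This is my own simplified fragment, not the sample's exact logic; it assumes serialized fields m_HandTrackingEvents and m_HandShapeOrPose (cast to XRHandShape), UnityEvents m_GesturePerformed, m_GestureEnded, and the added m_GestureHeld, plus a serialized float m_MinimumHoldTime:

```csharp
// Hypothetical detection fragment inside StaticHandGesture.
float m_HoldStartTime;
bool m_Performed;

void OnEnable()
{
    m_HandTrackingEvents.jointsUpdated.AddListener(OnJointsUpdated);
}

void OnDisable()
{
    m_HandTrackingEvents.jointsUpdated.RemoveListener(OnJointsUpdated);
}

void OnJointsUpdated(XRHandJointsUpdatedEventArgs eventArgs)
{
    var handShape = m_HandShapeOrPose as XRHandShape;
    bool detected = m_HandTrackingEvents.handIsTracked &&
        handShape != null &&
        handShape.CheckConditions(eventArgs);

    if (detected)
    {
        if (!m_Performed)
        {
            if (m_HoldStartTime == 0f)
                m_HoldStartTime = Time.timeSinceLevelLoad;

            // Require the shape to be held for the minimum time before firing.
            if (Time.timeSinceLevelLoad - m_HoldStartTime >= m_MinimumHoldTime)
            {
                m_Performed = true;
                m_GesturePerformed?.Invoke();
            }
        }
        else
        {
            // The added hold event: fires every frame the gesture persists.
            m_GestureHeld?.Invoke();
        }
    }
    else
    {
        if (m_Performed)
            m_GestureEnded?.Invoke();
        m_Performed = false;
        m_HoldStartTime = 0f;
    }
}
```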