    Hand tracking

    Hand tracking profile

    The Hand Tracking profile is found under the Input System profile. It contains settings for customizing hand representation.

    Joint prefabs

    Hand joints are visualized using simple prefabs. The Palm and Index Finger joints are of special importance and have their own prefabs, while all other joints share the same prefab.

    By default, the hand joint prefabs are simple geometric primitives. These can be replaced if desired. If no prefab is specified at all, empty GameObjects are created instead.

    Warning

    Avoid using complex scripts or expensive rendering in joint prefabs, since joint objects are transformed on every frame and can have significant performance cost!

    [Images: default hand joint representation; joint labels]

    Hand mesh prefab

    The hand mesh is used if fully defined mesh data is provided by the hand tracking device. The mesh renderable in the prefab is replaced by data from the device, so a dummy mesh such as a cube is sufficient. The material of the prefab is used for the hand mesh.

    Hand mesh display can have a noticeable performance impact; for this reason, it can be disabled entirely by unchecking the Enable Hand Mesh Visualization option.

    Hand visualization settings

    The hand mesh and hand joint visualizations can be turned on or off via the Hand Mesh Visualization Modes and Hand Joint Visualization Modes settings, respectively. These settings are application-mode specific, meaning it is possible to turn on some features while in editor (to see joints with in-editor simulation, for example) while having the same features turned off when deployed to device (in player builds).

    Note that it's generally recommended to have hand joint visualization turned on in editor (so that in-editor simulation will show where the hand joints are), and to have both hand joint visualization and hand mesh visualization turned off in player (because they incur a performance hit).
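
    These settings can also be changed from script at runtime by editing the active profile. A minimal sketch, assuming the EnableHandMeshVisualization and EnableHandJointVisualization convenience properties on MixedRealityHandTrackingProfile (verify against your MRTK version):

    using Microsoft.MixedReality.Toolkit;
    using Microsoft.MixedReality.Toolkit.Input;
    using UnityEngine;

    public class HandVisualizationToggle : MonoBehaviour
    {
        public void ToggleHandMesh()
        {
            // Fetch the hand tracking profile from the active input system profile.
            MixedRealityHandTrackingProfile handTrackingProfile =
                CoreServices.InputSystem?.InputSystemProfile?.HandTrackingProfile;

            if (handTrackingProfile != null)
            {
                // EnableHandJointVisualization toggles the joint prefabs analogously.
                handTrackingProfile.EnableHandMeshVisualization =
                    !handTrackingProfile.EnableHandMeshVisualization;
            }
        }
    }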

    Scripting

    Position and rotation can be requested from the input system for each individual hand joint as a MixedRealityPose.

    Alternatively, the system allows access to GameObjects that follow the joints. This can be useful if another GameObject should track a joint continuously.

    Available joints are listed in the TrackedHandJoint enum.
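
    For illustration, a sketch that iterates over the enum and logs whichever joints are currently tracked, using the HandJointUtils class described further below (names taken from the MRTK API):

    using System;
    using Microsoft.MixedReality.Toolkit.Input;
    using Microsoft.MixedReality.Toolkit.Utilities;
    using UnityEngine;

    public class JointLogger : MonoBehaviour
    {
        private void Update()
        {
            // TrackedHandJoint enumerates every joint MRTK can report.
            foreach (TrackedHandJoint joint in Enum.GetValues(typeof(TrackedHandJoint)))
            {
                // TryGetJointPose returns false for joints that are not currently tracked.
                if (HandJointUtils.TryGetJointPose(joint, Handedness.Right, out MixedRealityPose pose))
                {
                    Debug.Log($"{joint}: {pose.Position}");
                }
            }
        }
    }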

    Note

    Joint objects are destroyed when hand tracking is lost! Make sure that any scripts using the joint objects handle the null case gracefully to avoid errors!
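
    A minimal sketch of handling this, assuming jointTransform is a joint object obtained from one of the APIs below:

    using UnityEngine;

    public class FollowJoint : MonoBehaviour
    {
        // Assigned elsewhere, e.g. from the hand joint service below; the object
        // it points to is destroyed when hand tracking is lost.
        public Transform jointTransform;

        private void Update()
        {
            // Unity's overloaded null check also covers destroyed objects.
            if (jointTransform == null)
            {
                return;
            }

            transform.SetPositionAndRotation(jointTransform.position, jointTransform.rotation);
        }
    }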

    Accessing a given hand controller

    A specific hand controller is often available, e.g. when handling input events. In this case, joint data can be requested directly from the device using the IMixedRealityHand interface.

    Polling joint pose from controller

    The TryGetJoint function returns false if the requested joint is not available for some reason. In that case the resulting pose will be MixedRealityPose.ZeroIdentity.

    public void OnSourceDetected(SourceStateEventData eventData)
    {
      var hand = eventData.Controller as IMixedRealityHand;
      if (hand != null)
      {
        if (hand.TryGetJoint(TrackedHandJoint.IndexTip, out MixedRealityPose jointPose))
        {
          // ...
        }
      }
    }
    

    Joint transform from hand visualizer

    Joint objects can be requested from the controller visualizer.

    public void OnSourceDetected(SourceStateEventData eventData)
    {
      var handVisualizer = eventData.Controller.Visualizer as IMixedRealityHandVisualizer;
      if (handVisualizer != null)
      {
        if (handVisualizer.TryGetJointTransform(TrackedHandJoint.IndexTip, out Transform jointTransform))
        {
          // ...
        }
      }
    }
    

    Simplified joint data access

    If no specific controller is given, utility classes provide convenient access to hand joint data. These functions request joint data from the first available hand device currently tracked.

    Polling joint pose from HandJointUtils

    HandJointUtils is a static class that queries the first active hand device.

    if (HandJointUtils.TryGetJointPose(TrackedHandJoint.IndexTip, Handedness.Right, out MixedRealityPose pose))
    {
        // ...
    }
    

    Joint transform from hand joint service

    IMixedRealityHandJointService keeps a persistent set of GameObjects for tracking joints.

    var handJointService = CoreServices.GetInputSystemDataProvider<IMixedRealityHandJointService>();
    if (handJointService != null)
    {
        Transform jointTransform = handJointService.RequestJointTransform(TrackedHandJoint.IndexTip, Handedness.Right);
        // ...
    }
    

    Hand tracking events

    If polling data directly from controllers is not desirable, the input system also provides events.

    Joint events

    IMixedRealityHandJointHandler handles updates of joint positions.

    public class MyHandJointEventHandler : IMixedRealityHandJointHandler
    {
        public Handedness myHandedness;
    
        void IMixedRealityHandJointHandler.OnHandJointsUpdated(InputEventData<IDictionary<TrackedHandJoint, MixedRealityPose>> eventData)
        {
            if (eventData.Handedness == myHandedness)
            {
                if (eventData.InputData.TryGetValue(TrackedHandJoint.IndexTip, out MixedRealityPose pose))
                {
                    // ...
                }
            }
        }
    }
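
    Note that a plain class like this will not receive events until it is registered with the input system. A sketch of global registration via CoreServices (the same pattern applies to IMixedRealityHandMeshHandler below):

    var handler = new MyHandJointEventHandler();
    CoreServices.InputSystem?.RegisterHandler<IMixedRealityHandJointHandler>(handler);

    // Unregister when joint updates are no longer needed.
    CoreServices.InputSystem?.UnregisterHandler<IMixedRealityHandJointHandler>(handler);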
    

    Mesh events

    IMixedRealityHandMeshHandler handles changes of the articulated hand mesh.

    Note that hand meshes are not enabled by default.

    public class MyHandMeshEventHandler : IMixedRealityHandMeshHandler
    {
        public Handedness myHandedness;
        public Mesh myMesh;
    
        public void OnHandMeshUpdated(InputEventData<HandMeshInfo> eventData)
        {
            if (eventData.Handedness == myHandedness)
            {
                myMesh.vertices = eventData.InputData.vertices;
                myMesh.normals = eventData.InputData.normals;
                myMesh.triangles = eventData.InputData.triangles;
    
                if (eventData.InputData.uvs != null && eventData.InputData.uvs.Length > 0)
                {
                    myMesh.uv = eventData.InputData.uvs;
                }
    
                // ...
            }
        }
    }
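
    To display the received mesh, assign myMesh to a MeshFilter on a visible GameObject, and register the handler with the input system as shown above for joint events.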
    

    Known issues

    .NET Native

    There is currently a known issue with Master builds using the .NET backend. In .NET Native, IInspectable pointers cannot be marshaled from native to managed code using Marshal.GetObjectForIUnknown. The MRTK uses this to obtain the SpatialCoordinateSystem in order to receive hand and eye data from the platform.

    As a workaround for this issue, we've provided DLL source in the native Mixed Reality Toolkit repo. Please follow the instructions in the README there and copy the resulting binaries into a Plugins folder in your Unity assets. After that, the WindowsMixedRealityUtilities script provided in the MRTK will resolve the workaround for you.

    If you want to create your own DLL or include this workaround in an existing one, the core of the workaround is:

    extern "C" __declspec(dllexport) void __stdcall MarshalIInspectable(IUnknown* nativePtr, IUnknown** inspectable)
    {
        *inspectable = nativePtr;
    }
    

    And its use in your C# Unity code:

    // Requires: using System; using System.Runtime.InteropServices; using Windows.Perception.Spatial;
    [DllImport("DotNetNativeWorkaround.dll", EntryPoint = "MarshalIInspectable")]
    private static extern void GetSpatialCoordinateSystem(IntPtr nativePtr, out SpatialCoordinateSystem coordinateSystem);
    
    private static SpatialCoordinateSystem GetSpatialCoordinateSystem(IntPtr nativePtr)
    {
        try
        {
            GetSpatialCoordinateSystem(nativePtr, out SpatialCoordinateSystem coordinateSystem);
            return coordinateSystem;
        }
        catch
        {
            UnityEngine.Debug.LogError("Call to the DotNetNativeWorkaround plug-in failed. The plug-in is required for correct behavior when using .NET Native compilation");
            return Marshal.GetObjectForIUnknown(nativePtr) as SpatialCoordinateSystem;
        }
    }
    