    Gestures

    Gestures are input events based on human hands. There are two types of devices that raise gesture input events in MRTK:

    • Windows Mixed Reality devices such as HoloLens. These devices produce pinching motions ("Air Tap") and tap-and-hold gestures.

      For more information on HoloLens gestures see the Windows Mixed Reality Gestures documentation.

      WindowsMixedRealityDeviceManager wraps the Unity XR.WSA.Input.GestureRecognizer to consume Unity's gesture events from HoloLens devices.

    • Touch screen devices.

      UnityTouchController wraps the Unity Touch class that supports physical touch screens.

    Both of these input sources use the Gesture Settings profile to translate Unity's Touch and Gesture events, respectively, into MRTK's Input Actions. This profile can be found under the Input System Settings profile.
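
    For illustration, the gesture-to-action mappings in the active Gesture Settings profile can also be inspected at runtime. The following is a minimal sketch, assuming the MRTK input system is initialized and a gestures profile is assigned; GestureProfileDump is a hypothetical name:

    ```csharp
    using Microsoft.MixedReality.Toolkit;
    using Microsoft.MixedReality.Toolkit.Input;
    using UnityEngine;

    // Hypothetical helper that logs each gesture-to-action mapping in the
    // active Gesture Settings profile. Assumes the MRTK input system is
    // initialized and a gestures profile is assigned.
    public class GestureProfileDump : MonoBehaviour
    {
        private void Start()
        {
            var gesturesProfile = CoreServices.InputSystem?.InputSystemProfile?.GesturesProfile;
            if (gesturesProfile == null) { return; }

            foreach (var mapping in gesturesProfile.Gestures)
            {
                // Each mapping pairs a Unity gesture type with an MRTK input action.
                Debug.Log($"{mapping.Description}: {mapping.GestureType} -> {mapping.Action.Description}");
            }
        }
    }
    ```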

    Gesture events

    Gesture events are received by implementing one of the gesture handler interfaces: IMixedRealityGestureHandler or IMixedRealityGestureHandler<TYPE> (see table of event handlers).

    See Example Scene for an example implementation of a gesture event handler.

    When implementing the generic version, the OnGestureCompleted and OnGestureUpdated events can receive typed data of the following types (a handler sketch follows the list):

    • Vector2 - 2D position gesture. Produced by touch screens to report their deltaPosition.
    • Vector3 - 3D position gesture. Produced by HoloLens to report:
      • the cumulativeDelta of a manipulation event
      • the normalizedOffset of a navigation event
    • Quaternion - 3D rotation gesture. Available to custom input sources but not currently produced by any of the existing ones.
    • MixedRealityPose - Combined 3D position/rotation gesture. Available to custom input sources but not currently produced by any of the existing ones.
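
    The following is a minimal sketch of such a handler, assuming the Vector3 variant; GestureLogger is a hypothetical name, and the global registration is an assumption for objects that do not hold focus (focused objects receive these events without it):

    ```csharp
    using Microsoft.MixedReality.Toolkit;
    using Microsoft.MixedReality.Toolkit.Input;
    using UnityEngine;

    // Hypothetical handler for Vector3 gesture data (manipulation and
    // navigation gestures on HoloLens).
    public class GestureLogger : MonoBehaviour, IMixedRealityGestureHandler<Vector3>
    {
        private void OnEnable()
        {
            // Global registration is an assumption; objects that hold focus
            // receive gesture events without it.
            CoreServices.InputSystem?.RegisterHandler<IMixedRealityGestureHandler<Vector3>>(this);
        }

        private void OnDisable()
        {
            CoreServices.InputSystem?.UnregisterHandler<IMixedRealityGestureHandler<Vector3>>(this);
        }

        public void OnGestureStarted(InputEventData eventData)
        {
            Debug.Log($"Gesture started: {eventData.MixedRealityInputAction.Description}");
        }

        // Untyped overloads from the base IMixedRealityGestureHandler interface.
        public void OnGestureUpdated(InputEventData eventData) { }
        public void OnGestureCompleted(InputEventData eventData) { }

        // Typed overloads: InputData carries the cumulativeDelta of a
        // manipulation event or the normalizedOffset of a navigation event.
        public void OnGestureUpdated(InputEventData<Vector3> eventData)
        {
            Debug.Log($"Gesture updated: {eventData.InputData}");
        }

        public void OnGestureCompleted(InputEventData<Vector3> eventData)
        {
            Debug.Log($"Gesture completed: {eventData.InputData}");
        }

        public void OnGestureCanceled(InputEventData eventData)
        {
            Debug.Log("Gesture canceled");
        }
    }
    ```

    Attached to any GameObject in the scene, this component logs manipulation and navigation data as gestures progress.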

    Order of events

    There are two principal chains of events, depending on user input; the sketch after the lists shows how a handler can tell them apart:

    • "Hold":

      1. Hold tap:
        • start Manipulation
      2. Hold tap beyond HoldStartDuration:
        • start Hold
      3. Release tap:
        • complete Hold
        • complete Manipulation
    • "Move":

      1. Hold tap:
        • start Manipulation
      2. Hold tap beyond HoldStartDuration:
        • start Hold
      3. Move hand beyond NavigationStartThreshold:
        • cancel Hold
        • start Navigation
      4. Release tap:
        • complete Manipulation
        • complete Navigation
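
    Each step in these chains arrives as its own input action, so a handler can distinguish them by comparing the event's action against the actions assigned in the Gesture Settings profile. The following sketch rests on that assumption; GestureChainLogger and its serialized action fields are hypothetical and would be wired up in the inspector:

    ```csharp
    using Microsoft.MixedReality.Toolkit.Input;
    using UnityEngine;

    // Hypothetical handler that tells the chains apart by comparing each
    // event's action against the actions assigned to Hold, Manipulation and
    // Navigation in the Gesture Settings profile (fields set in the inspector).
    public class GestureChainLogger : MonoBehaviour, IMixedRealityGestureHandler
    {
        [SerializeField] private MixedRealityInputAction holdAction = MixedRealityInputAction.None;
        [SerializeField] private MixedRealityInputAction manipulationAction = MixedRealityInputAction.None;
        [SerializeField] private MixedRealityInputAction navigationAction = MixedRealityInputAction.None;

        public void OnGestureStarted(InputEventData eventData)
        {
            var action = eventData.MixedRealityInputAction;
            if (action == manipulationAction)
            {
                Debug.Log("Manipulation started (tap held)");
            }
            else if (action == holdAction)
            {
                Debug.Log("Hold started (tap held beyond HoldStartDuration)");
            }
            else if (action == navigationAction)
            {
                Debug.Log("Navigation started (hand moved beyond NavigationStartThreshold)");
            }
        }

        public void OnGestureUpdated(InputEventData eventData) { }
        public void OnGestureCompleted(InputEventData eventData) { }

        public void OnGestureCanceled(InputEventData eventData)
        {
            // In the "Move" chain, Hold is canceled when Navigation starts.
            if (eventData.MixedRealityInputAction == holdAction)
            {
                Debug.Log("Hold canceled by Navigation");
            }
        }
    }
    ```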

    Example scene

    The HandInteractionGestureEventsExample scene (Assets/MRTK/Examples/Demos/HandTracking/Scenes) shows how to use the pointer Result to spawn an object at the hit location.
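
    The following is a minimal sketch of that idea, not the scene's actual script; SpawnAtHit and spawnPrefab are hypothetical names:

    ```csharp
    using Microsoft.MixedReality.Toolkit.Input;
    using UnityEngine;

    // A sketch of the scene's core idea: on gesture completion, read the
    // pointer's Result and spawn a prefab at the hit point. "spawnPrefab"
    // is a hypothetical field, assigned in the inspector.
    public class SpawnAtHit : MonoBehaviour, IMixedRealityGestureHandler
    {
        [SerializeField] private GameObject spawnPrefab = null;

        public void OnGestureStarted(InputEventData eventData) { }
        public void OnGestureUpdated(InputEventData eventData) { }
        public void OnGestureCanceled(InputEventData eventData) { }

        public void OnGestureCompleted(InputEventData eventData)
        {
            foreach (var pointer in eventData.InputSource.Pointers)
            {
                // Result is null when the pointer is not hitting anything.
                if (pointer.Result != null)
                {
                    Instantiate(spawnPrefab, pointer.Result.Details.Point, Quaternion.identity);
                    break;
                }
            }
        }
    }
    ```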

    The GestureTester script (Assets/MRTK/Examples/Demos/HandTracking/Script) is an example implementation that visualizes gesture events via GameObjects. The handler functions change the color of indicator objects and display the last recorded event in text objects in the scene.
