
We've moved!

Starting with MRTK 2.6, we are publishing both conceptual docs and API references on docs.microsoft.com. For conceptual docs, please visit our new landing page. For API references, please visit the MRTK-Unity section of the .NET API browser. Existing content will remain here but will not be updated further.


    Input System

    The input system is one of the largest systems among all the features offered by the MRTK. Many things within the toolkit build on top of it (pointers, focus, prefabs). The code within the input system is what enables natural interactions like grab and rotate across platforms.

    The input system has some terminology of its own that is worth defining:

    • Data providers

      The input settings in the input profile have references to entities known as data providers, also known as device managers. These are components whose job is to extend the MRTK input system by interfacing with a specific underlying system. An example of a provider is the Windows Mixed Reality provider, whose job is to talk to the underlying Windows Mixed Reality APIs and then translate the data from those APIs into the MRTK-specific input concepts defined below. Another example is the OpenVR provider, whose job is to talk to Unity's abstracted version of the OpenVR APIs and then translate that data into MRTK input concepts.

    • Controller

      A representation of a physical controller (whether it’s a 6-degree-of-freedom controller, a HoloLens 1-style hand with gesture support, a fully articulated hand, a Leap Motion controller, etc.). Controllers are spawned by device managers (for example, the Windows Mixed Reality device manager will spawn a controller and manage its lifetime when it sees an articulated hand come into existence).
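      As a sketch of how this looks in code (assuming the MRTK 2.x CoreServices accessor), the controllers currently spawned by the registered device managers can be enumerated at runtime:

```csharp
using Microsoft.MixedReality.Toolkit;
using Microsoft.MixedReality.Toolkit.Input;
using UnityEngine;

public class ControllerLister : MonoBehaviour
{
    private void Update()
    {
        // DetectedControllers contains every controller currently
        // spawned by a device manager (hands, motion controllers, etc.).
        foreach (IMixedRealityController controller in
                 CoreServices.InputSystem.DetectedControllers)
        {
            Debug.Log($"{controller.ControllerHandedness}: {controller.GetType().Name}");
        }
    }
}
```

      On HoloLens 2, for example, this would log a controller per tracked articulated hand while the hand is visible, and the entry disappears when the device manager destroys the controller.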

    • Pointer

      Controllers use pointers to interact with game objects. For example, the near interaction pointer is responsible for detecting when the hand (which is a controller) is close to objects that advertise themselves as supporting ‘near interaction’. Other examples of pointers are the teleport pointer and far pointers (such as the shell hand ray pointer), which use far raycasts to engage with content beyond arm’s length from the user.

      Pointers are created by the device manager and then attached to an input source. To get all of the pointers for a controller, use controller.InputSource.Pointers.

      Note that a controller can be associated with many different pointers at the same time – in order to ensure that this doesn’t devolve into chaos, there is a pointer mediator which controls which pointers are allowed to be active (for example, the mediator will disable far interaction pointers when near interaction is detected).
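      A minimal sketch of enumerating each controller's pointers (again assuming the MRTK 2.x CoreServices accessor):

```csharp
using Microsoft.MixedReality.Toolkit;
using Microsoft.MixedReality.Toolkit.Input;
using UnityEngine;

public class PointerLister : MonoBehaviour
{
    private void LogPointers()
    {
        foreach (IMixedRealityController controller in
                 CoreServices.InputSystem.DetectedControllers)
        {
            // A single controller can carry several pointers at once;
            // the pointer mediator decides which ones are currently active.
            foreach (IMixedRealityPointer pointer in controller.InputSource.Pointers)
            {
                Debug.Log($"{controller.ControllerHandedness} pointer: " +
                           $"{pointer.PointerName}, active: {pointer.IsActive}");
            }
        }
    }
}
```

      Running this with an articulated hand visible would typically list several pointers (grab, poke, hand ray) on the same input source, with the mediator keeping only the appropriate ones active.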

    • Focus

      Pointer events are sent to objects in focus. Focus selection will vary by pointer type - a hand ray pointer will use raycasts, while a poke pointer will use spherecasts. An object must implement IMixedRealityFocusHandler to receive focus. It's possible to globally register an object to receive unfiltered pointer events, but this approach is not recommended.

      The component that updates which objects are in focus is the FocusProvider.
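      A minimal sketch of receiving focus events, per the IMixedRealityFocusHandler interface mentioned above (the class name FocusLogger is illustrative):

```csharp
using Microsoft.MixedReality.Toolkit.Input;
using UnityEngine;

// Attach to a GameObject that has a collider so pointers can focus it.
public class FocusLogger : MonoBehaviour, IMixedRealityFocusHandler
{
    public void OnFocusEnter(FocusEventData eventData)
    {
        // eventData.Pointer is the pointer that gained focus on this object.
        Debug.Log($"Focused by pointer: {eventData.Pointer.PointerName}");
    }

    public void OnFocusExit(FocusEventData eventData)
    {
        Debug.Log("Focus lost");
    }
}
```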

    • Cursor

      An entity associated with a pointer that gives additional visual cues around pointer interaction. For example, the FingerCursor will render a ring around your finger, and may rotate that ring when your finger is close to ‘near interactable’ objects. A pointer can be associated with a single cursor at a time.

    • Interaction and Manipulation

      Objects can be tagged with an interaction or manipulation script (for example, Interactable.cs, or something like NearInteractionGrabbable.cs/ManipulationHandler.cs).

      For example, NearInteractionGrabbable and NearInteractionTouchable allow for certain pointers (especially near interaction pointers) to know which objects can be focused on.

      Interactable and ManipulationHandler are examples of components that listen to pointer events to modify UI visuals or move/scale/rotate game objects.
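      A minimal sketch of wiring these components together at runtime, assuming the object already has a collider (in practice both components are usually added in the editor instead):

```csharp
using Microsoft.MixedReality.Toolkit.Input;
using Microsoft.MixedReality.Toolkit.UI;
using UnityEngine;

public class MakeGrabbable : MonoBehaviour
{
    private void Start()
    {
        // NearInteractionGrabbable lets near interaction pointers
        // (e.g. the grab pointer) focus and grab this object.
        gameObject.AddComponent<NearInteractionGrabbable>();

        // ManipulationHandler listens to pointer events and applies
        // move/rotate/scale to the object being manipulated.
        gameObject.AddComponent<ManipulationHandler>();
    }
}
```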

    The image below captures the high-level composition (from the bottom up) of the MRTK input stack:

    Input System Diagram
