
We've moved!

Starting from MRTK 2.6, we are publishing both conceptual docs and API references on docs.microsoft.com. For conceptual docs, please visit our new landing page. For API references, please visit the MRTK-Unity section of the dot net API explorer. Existing content will remain here but will not be updated further.


    Core System

    At the heart of the input system is the MixedRealityInputSystem, a service that is responsible for initializing and operating all of the input-related functionality in the MRTK.

    Note

    It is assumed that readers have already read the terminology section and have a basic understanding of its concepts.

    This service is responsible for:

    • Reading the input system profile
    • Starting the various device managers (for example, OpenVR, Windows Mixed Reality, Unity Touch). The set of device managers that are instantiated is configured by the input system profile.
    • Instantiation of the GazeProvider, which is a component that is responsible for providing HoloLens (1st gen)-style head gaze information in addition to HoloLens 2-style eye gaze information.
    • Instantiation of the FocusProvider, which is a component that is responsible for determining objects that have focus. This is described in more depth in the pointers and focus section of the documentation.
    • Providing registration points for all input events (as global listeners).
    • Providing event dispatch capabilities for those input events.
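
    As a sketch of how application code reaches this service, the snippet below uses MRTK's CoreServices.InputSystem accessor to query the providers the service instantiates. The type and member names follow MRTK 2.x; the component name itself is invented for illustration:

```csharp
using Microsoft.MixedReality.Toolkit;
using Microsoft.MixedReality.Toolkit.Input;
using UnityEngine;

// Hypothetical component showing how to reach the input system service.
public class InputSystemAccessExample : MonoBehaviour
{
    private void Start()
    {
        IMixedRealityInputSystem inputSystem = CoreServices.InputSystem;
        if (inputSystem != null)
        {
            // The GazeProvider and FocusProvider instantiated by the service
            // are reachable through the same interface.
            Debug.Log($"Gaze origin: {inputSystem.GazeProvider.GazeOrigin}");

            GameObject focused = inputSystem.FocusProvider.GetFocusedObject(
                inputSystem.GazeProvider.GazePointer);
            Debug.Log($"Gaze-focused object: {focused?.name}");
        }
    }
}
```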

    Input events

    Input events are generally fired on two different channels:

    Objects in focus

    Events can be sent directly to the GameObject that has focus. For example, an object might have a script that implements IMixedRealityTouchHandler; it would receive touch events when focused by a hand that is near it. These events bubble "up" the GameObject hierarchy until a GameObject capable of handling the event is found.
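
    A minimal focused-object handler might look like the following sketch. The component name is invented; IMixedRealityTouchHandler and HandTrackingInputEventData are MRTK types, and for near touch to target the object it also needs a collider and a NearInteractionTouchable component:

```csharp
using Microsoft.MixedReality.Toolkit.Input;
using UnityEngine;

// Hypothetical handler that receives touch events while this object has focus.
public class TouchFeedback : MonoBehaviour, IMixedRealityTouchHandler
{
    public void OnTouchStarted(HandTrackingInputEventData eventData)
    {
        Debug.Log($"Touched by {eventData.Controller?.ControllerHandedness} hand");
    }

    public void OnTouchUpdated(HandTrackingInputEventData eventData) { }

    public void OnTouchCompleted(HandTrackingInputEventData eventData)
    {
        Debug.Log("Touch ended");
    }
}
```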

    This is done by using ExecuteHierarchy in DispatchEventToObjectFocusedByPointer.

    Global listeners

    Events can be sent to global listeners. It's possible to register for all input events by using the input system's IMixedRealityEventSystem interface. It's recommended to use the RegisterHandler method for registering for global events - the deprecated Register method notifies listeners of ALL input events, rather than just input events of a particular type (where type is defined by the event interface).

    Note that fallback listeners are another type of global listener; they are also discouraged because they receive every input event that hasn't been handled elsewhere in the scene.
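
    The recommended registration pattern can be sketched as follows, assuming a speech handler. The class name is invented; RegisterHandler and UnregisterHandler are the IMixedRealityEventSystem methods mentioned above, and pairing them in OnEnable/OnDisable keeps registration tied to the component's lifetime:

```csharp
using Microsoft.MixedReality.Toolkit;
using Microsoft.MixedReality.Toolkit.Input;
using UnityEngine;

// Hypothetical global listener: receives speech events regardless of focus.
public class GlobalSpeechListener : MonoBehaviour, IMixedRealitySpeechHandler
{
    private void OnEnable()
    {
        // Register for speech events only, not every input event.
        CoreServices.InputSystem?.RegisterHandler<IMixedRealitySpeechHandler>(this);
    }

    private void OnDisable()
    {
        CoreServices.InputSystem?.UnregisterHandler<IMixedRealitySpeechHandler>(this);
    }

    public void OnSpeechKeywordRecognized(SpeechEventData eventData)
    {
        Debug.Log($"Heard keyword: {eventData.Command.Keyword}");
    }
}
```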

    Order of event dispatch

    Generally, events are sent to listeners in the following order. If any of the steps below marks the event as handled, the event dispatch process stops.

    1. Event is sent to global listeners.
    2. Event is sent to modal dialogs of the focused object.
    3. Event is sent to the focused object.
    4. Event is sent to fallback listeners.
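
    A handler stops this dispatch chain by marking the event as used. For example, a hypothetical global speech handler could consume one keyword so that the later steps (the focused object and fallback listeners) never see it. The "close" keyword here is purely illustrative; Use() comes from Unity's BaseEventData, which MRTK event data derives from:

```csharp
public void OnSpeechKeywordRecognized(SpeechEventData eventData)
{
    if (eventData.Command.Keyword == "close")
    {
        // Marking the event as used halts the remaining dispatch steps.
        eventData.Use();
    }
}
```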

    Device Managers / Data Providers

    These entities are responsible for interfacing with lower-level APIs (such as Windows Mixed Reality APIs, or OpenVR APIs) and translating data from those systems into forms that fit the MRTK's higher-level input abstractions. They are responsible for detecting, creating, and managing the lifetime of controllers.

    The basic flow of a device manager involves:

    1. The device manager is instantiated by the input system service.
    2. The device manager registers with its underlying system (for example, the Windows Mixed Reality device manager will register for gesture and interaction events).
    3. It creates controllers that it discovers from the underlying system (for example, the provider could detect the presence of articulated hands).
    4. In its Update() loop, it calls UpdateController() to poll the underlying system for new state and to update its controller representations.
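
    The steps above can be sketched as a skeleton data provider. The class name and supported platform are placeholders, and a real provider would also raise source-detected/lost events and define interaction mappings; the attribute, base class, and constructor shape follow MRTK 2.x:

```csharp
using Microsoft.MixedReality.Toolkit;
using Microsoft.MixedReality.Toolkit.Input;
using Microsoft.MixedReality.Toolkit.Utilities;

// Hypothetical skeleton of a device manager / data provider.
[MixedRealityDataProvider(
    typeof(IMixedRealityInputSystem),
    SupportedPlatforms.WindowsStandalone,
    "Example Device Manager")]
public class ExampleDeviceManager : BaseInputDeviceManager
{
    public ExampleDeviceManager(
        IMixedRealityInputSystem inputSystem,
        string name,
        uint priority,
        BaseMixedRealityProfile profile)
        : base(inputSystem, name, priority, profile) { }

    public override void Enable()
    {
        // Step 2: register with the underlying platform API here.
    }

    public override void Update()
    {
        // Step 4: poll the underlying system and update controller state here.
    }
}
```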