We've moved!

Starting from MRTK 2.6, we are publishing both conceptual docs and API references on docs.microsoft.com. For conceptual docs, please visit our new landing page. For API references, please visit the MRTK-Unity section of the .NET API explorer. Existing content will remain here but will not be updated further.


    Controllers, Pointers, and Focus

    Controllers, pointers, and focus are higher level concepts that build upon the foundation established by the core input system. Together they provide a large portion of the mechanism for interacting with objects in the scene.

    Controllers

    Controllers are representations of a physical controller (a 6-degree-of-freedom motion controller, an articulated hand, etc.). They are created by device managers and are responsible for communicating with the corresponding underlying system and translating that data into MRTK-shaped data and events.

    For example, on the Windows Mixed Reality platform, the WindowsMixedRealityArticulatedHand is a controller that is responsible for interfacing with the underlying Windows hand tracking APIs to get information about the joints, pose, and other properties of the hand. It is responsible for turning this data into relevant MRTK events (for example, by calling RaisePoseInputChanged or RaiseHandJointsUpdated) and by updating its own internal state so that queries for TryGetJointPose will return correct data.
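    Because the controller keeps its internal state current, application code can query it indirectly. As a minimal sketch (MRTK 2.x API names; `HandJointUtils` wraps the per-controller `TryGetJointPose` queries):

```csharp
// Illustrative sketch: reading joint data that an articulated hand
// controller keeps up to date each frame.
using Microsoft.MixedReality.Toolkit.Input;
using Microsoft.MixedReality.Toolkit.Utilities;
using UnityEngine;

public class IndexTipLogger : MonoBehaviour
{
    private void Update()
    {
        // Returns true only while the right hand is tracked and the
        // joint pose is available.
        if (HandJointUtils.TryGetJointPose(
                TrackedHandJoint.IndexTip, Handedness.Right, out MixedRealityPose pose))
        {
            Debug.Log($"Right index tip at {pose.Position}");
        }
    }
}
```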

    Generally, a controller's lifecycle will involve:

    1. A controller gets created by a device manager upon detection of a new source (for example, when the device manager detects and starts tracking a hand).

    2. In the controller's Update() loop, it calls into its underlying API system.

    3. In the same update loop, it raises input event changes by calling directly into the core input system itself (for example, raising HandMeshUpdated, or HandJointsUpdated).
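    The steps above can be sketched as a simplified update method on a hand controller. This is illustrative only; `platformApi`, `UpdateJointPoses`, and `jointPoses` are hypothetical placeholders for platform-specific code, while `RaiseHandJointsUpdated` is the real MRTK 2.x input system call:

```csharp
// Simplified sketch of a controller's per-frame update.
public void UpdateController()
{
    // 1. Pull the latest frame of data from the underlying platform API
    //    (hypothetical platform call).
    var data = platformApi.GetLatestHandData();

    // 2. Translate it into MRTK-shaped internal state so that queries
    //    such as TryGetJointPose return correct data.
    UpdateJointPoses(data);

    // 3. Raise input system events so the rest of MRTK can react.
    CoreServices.InputSystem?.RaiseHandJointsUpdated(
        InputSource, ControllerHandedness, jointPoses);
}
```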

    Pointers and focus

    Pointers are used to interact with game objects. This section describes how pointers are created, how they get updated, and how they determine the object(s) that are in focus. It will also cover the different types of pointers that exist and the scenarios in which they are active.

    Pointer categories

    Pointers generally fall into one of the following categories:

    • Far pointers

      These types of pointers are used to interact with objects that are far away from the user (where "far away" is defined simply as "not near"). These types of pointers generally cast lines that can go far into the world and allow the user to interact with and manipulate objects that are not immediately next to them.

    • Near pointers

      These types of pointers are used to interact with objects that are close enough to the user to grab, touch, and manipulate. Generally these types of pointers find objects in the nearby vicinity, either by raycasting at small distances, by sphere casting for objects in the vicinity, or by enumerating lists of objects that are considered grabbable/touchable.

    • Teleport pointers

      These types of pointers plug into the teleportation system to handle moving the user to the location targeted by the pointer.
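    To make the near-pointer behavior concrete, the core of a spherical query can be sketched with plain Unity physics (MRTK's SpherePointer adds layer configuration, filtering, and caching on top of this idea):

```csharp
// Minimal sketch of the kind of spherical query a near pointer performs.
using UnityEngine;

public static class NearQuery
{
    public static Collider[] FindGrabbables(
        Vector3 handPosition, float radius, LayerMask grabLayers)
    {
        // Everything within `radius` of the hand on the given layers
        // is a candidate for near interaction.
        return Physics.OverlapSphere(handPosition, radius, grabLayers);
    }
}
```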

    Pointer Mediation

    Because a single controller can have multiple pointers (for example, the articulated hand can have both near and far interaction pointers), there exists a component that is responsible for mediating which pointer should be active.

    For example, as the user’s hand approaches a pressable button, the ShellHandRayPointer should stop showing, and the PokePointer should be engaged.

    This is handled by the DefaultPointerMediator, which determines which pointers should be active based on the state of all pointers. One of the key things it does is disable far pointers when a near pointer is close to an object.

    It's possible to provide an alternate implementation of the pointer mediator by changing the PointerMediator property on the pointer profile.
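    As a sketch of what an alternate mediator could look like, assuming your MRTK 2.x version exposes DefaultPointerMediator with a virtual UpdatePointers() (if not, implement IMixedRealityPointerMediator directly):

```csharp
// Sketch: layering app-specific rules on top of the default mediation.
using Microsoft.MixedReality.Toolkit.Input;

public class MyPointerMediator : DefaultPointerMediator
{
    public override void UpdatePointers()
    {
        // Let the default near/far arbitration run first...
        base.UpdatePointers();

        // ...then apply app-specific rules here (for example,
        // force-disable all far pointers while a modal dialog is open).
    }
}
```

    The custom type is then assigned via the PointerMediator property on the pointer profile, as described above.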

    FocusProvider

    The FocusProvider is the workhorse that is responsible for iterating over the list of all pointers and figuring out what the focused object is for each pointer.

    In each Update() call, this will:

    1. Update all of the pointers by raycasting and doing hit detection as configured by the pointer itself (for example, the sphere pointer could specify the SphereOverlap raycastMode, so the FocusProvider will do a sphere-based collision check)

    2. Update the focused object on a per-pointer basis (i.e., if an object gained focus, focus-entered events are raised on that object; if an object lost focus, focus-exited events are raised; and so on).
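    Objects can react to these per-pointer focus changes by implementing the MRTK 2.x focus handler interface, for example:

```csharp
// Sketch: reacting to focus-entered / focus-exited events on an object.
using Microsoft.MixedReality.Toolkit.Input;
using UnityEngine;

public class FocusLogger : MonoBehaviour, IMixedRealityFocusHandler
{
    public void OnFocusEnter(FocusEventData eventData)
    {
        Debug.Log($"Focused by {eventData.Pointer.PointerName}");
    }

    public void OnFocusExit(FocusEventData eventData)
    {
        Debug.Log($"Focus lost from {eventData.Pointer.PointerName}");
    }
}
```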

    Pointer configuration and lifecycle

    Pointers can be configured in the Pointers section of the input system profile.

    The lifetime of a pointer is generally the following:

    1. A device manager will detect the presence of a controller - this device manager will then create the set of pointers associated with the controller via a call to RequestPointers.

    2. The FocusProvider, in its Update() loop, will iterate over all of the valid pointers and do the associated raycast or hit detection logic - this is used to determine the object that is focused by each particular pointer.

      • Because it's possible to have multiple sources of input active at the same time (for example, two hands present), it's also possible to have multiple objects that have focus at the same time.
    3. The device manager, upon discovering that a controller source was lost, will tear down the pointers associated with the lost controller.
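    At any point during this lifecycle, application code can walk the detected input sources and read each pointer's current focus result. A minimal sketch using MRTK 2.x APIs:

```csharp
// Sketch: enumerating active pointers and their focused objects.
using Microsoft.MixedReality.Toolkit;
using Microsoft.MixedReality.Toolkit.Input;
using UnityEngine;

public class FocusReader : MonoBehaviour
{
    private void Update()
    {
        foreach (IMixedRealityInputSource source in
                 CoreServices.InputSystem.DetectedInputSources)
        {
            foreach (IMixedRealityPointer pointer in source.Pointers)
            {
                // Result is populated by the FocusProvider each frame.
                GameObject focused = pointer.Result?.CurrentPointerTarget;
                if (focused != null)
                {
                    Debug.Log($"{pointer.PointerName} focuses {focused.name}");
                }
            }
        }
    }
}
```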
