
    How to add near interaction in MRTK

    Near interactions come in the form of touches and grabs. Touch and grab events are raised as pointer events by the PokePointer and SpherePointer, respectively.

    Three key steps are required to listen for touch and/or grab input events on a particular GameObject.

    1. Ensure the relevant pointer is registered in the main MRTK Configuration Profile.
    2. Ensure the desired GameObject has the appropriate grab or touch script component and a Unity collider.
    3. Implement an input handler interface in a script attached to the desired GameObject to listen for the grab or touch events.

    Add grab interactions

    1. Ensure a SpherePointer is registered in the MRTK Pointer profile.

      The default MRTK profile and the default HoloLens 2 profile already contain a SpherePointer. To confirm that a SpherePointer will be created, select the MRTK Configuration Profile and navigate to Input > Pointers > Pointer Options. The default GrabPointer prefab (Assets/MRTK/SDK/Features/UX/Prefabs/Pointers) should be listed with a Controller Type of Articulated Hand. A custom prefab can be used as long as it implements the SpherePointer class.

      [Image: Grab Pointer Profile Example]

      The default grab pointer queries for nearby objects in a cone around the grab point to match the default HoloLens 2 interface.

      [Image: Conical Grab Pointer]

    2. On the GameObject that should be grabbable, add a NearInteractionGrabbable component, as well as a collider.

      Make sure the GameObject is on a grabbable layer. By default, all layers except Spatial Awareness and Ignore Raycasts are grabbable. See which layers are grabbable by inspecting the Grab Layer Masks in your GrabPointer prefab.

    3. On the GameObject or one of its ancestors, add a script component that implements the IMixedRealityPointerHandler interface. Any ancestor of the object with the NearInteractionGrabbable will be able to receive pointer events, as well. (A code sketch of steps 2 and 3 follows this list.)
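
    The same setup can also be done from code. Below is a minimal sketch, assuming an MRTK-configured scene; the class name is hypothetical:

    using Microsoft.MixedReality.Toolkit.Input;
    using UnityEngine;

    // Hypothetical example component: makes its own GameObject grabbable at startup.
    public class GrabbableSetup : MonoBehaviour
    {
        private void Start()
        {
            // Step 2: a collider plus NearInteractionGrabbable make the object grabbable.
            if (GetComponent<Collider>() == null)
            {
                gameObject.AddComponent<BoxCollider>();
            }
            gameObject.AddComponent<NearInteractionGrabbable>();

            // Keep the object on a grabbable layer; by default all layers except
            // Spatial Awareness and Ignore Raycasts are grabbable.
            gameObject.layer = LayerMask.NameToLayer("Default");
        }
    }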

    Grab code example

    Below is a script that prints whether an event is a touch or a grab. In the relevant IMixedRealityPointerHandler interface function, one can inspect the type of pointer that triggered the event via the MixedRealityPointerEventData. If the pointer is a SpherePointer, the interaction is a grab.

    using Microsoft.MixedReality.Toolkit.Input;
    using UnityEngine;

    public class PrintPointerEvents : MonoBehaviour, IMixedRealityPointerHandler
    {
        public void OnPointerDown(MixedRealityPointerEventData eventData)
        {
            // A SpherePointer raising the event indicates a grab; a PokePointer indicates a touch.
            if (eventData.Pointer is SpherePointer)
            {
                Debug.Log($"Grab start from {eventData.Pointer.PointerName}");
            }
            if (eventData.Pointer is PokePointer)
            {
                Debug.Log($"Touch start from {eventData.Pointer.PointerName}");
            }
        }

        public void OnPointerClicked(MixedRealityPointerEventData eventData) {}
        public void OnPointerDragged(MixedRealityPointerEventData eventData) {}
        public void OnPointerUp(MixedRealityPointerEventData eventData) {}
    }
    

    Add touch interactions

    The process for adding touch interactions to UnityUI elements is different from the one for vanilla 3D GameObjects. If you are targeting Unity UI components, skip ahead to the Unity UI section below.

    For both types of UX elements, though, first ensure a PokePointer is registered in the MRTK Pointer profile.

    The default MRTK profile and the default HoloLens 2 profile already contain a PokePointer. To confirm that a PokePointer will be created, select the MRTK Configuration Profile and navigate to Input > Pointers > Pointer Options. The default PokePointer prefab (Assets/MRTK/SDK/Features/UX/Prefabs/Pointers) should be listed with a Controller Type of Articulated Hand. A custom prefab can be used as long as it implements the PokePointer class.

    [Image: Poke Pointer Profile Example]
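
    One way to verify at runtime that a poke pointer is actually created for an articulated hand is to query MRTK's PointerUtils helper. A minimal diagnostic sketch, with a hypothetical component name:

    using Microsoft.MixedReality.Toolkit.Input;
    using Microsoft.MixedReality.Toolkit.Utilities;
    using UnityEngine;

    // Hypothetical diagnostic component: logs the right hand's PokePointer when one exists.
    public class PokePointerCheck : MonoBehaviour
    {
        private void Update()
        {
            // GetPointer<T> returns null when no matching pointer is attached to that hand.
            PokePointer pokePointer = PointerUtils.GetPointer<PokePointer>(Handedness.Right);
            if (pokePointer != null)
            {
                Debug.Log($"PokePointer found: {pokePointer.PointerName}");
            }
        }
    }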

    3D GameObjects

    There are two different ways of adding touch interactions to 3D GameObjects, depending on whether your 3D object should have only a single touchable plane or should be touchable across its entire collider. The first is typical for objects with BoxColliders, where only a single face of the collider should react to touch events. The second is for objects that need to be touchable from any direction, based on their collider.

    Single face touch

    This is useful in situations where only a single face needs to be touchable. This option assumes that the GameObject has a BoxCollider. It's possible to use this with non-BoxCollider objects, in which case the Bounds and Local Center properties must be set manually to configure the touchable plane (i.e. Bounds should be set to a non-zero value).

    1. On the GameObject that should be touchable, add a BoxCollider and a NearInteractionTouchable component.

      1. Set Events to Receive to Touch if using the IMixedRealityTouchHandler interface in your component script below.

      2. Click Fix bounds and Fix center.

      [Image: NearInteractionTouchable Gizmos Example]

    2. On that object or one of its ancestors, add a script component that implements the IMixedRealityTouchHandler interface. Any ancestor of the object with the NearInteractionTouchable will be able to receive pointer events, as well. (A code sketch of this setup follows the note below.)

    Note

    In the editor scene view with the NearInteractionTouchable GameObject selected, notice a white outline square and arrow. The arrow points to the "front" of the touchable. The collider will only be touchable from that direction. To make a collider touchable from all directions, see the section on arbitrary collider touch.

    [Image: NearInteractionTouchable Gizmos Example]
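
    The same single-face setup can be sketched in code. The following assumes a BoxCollider-based object; the Bounds and Local Center may still need configuring, which the Fix bounds and Fix center buttons handle in the editor. The class name is hypothetical, and TouchEventsExample is the handler defined later in this article:

    using Microsoft.MixedReality.Toolkit.Input;
    using UnityEngine;

    // Hypothetical example component: configures a single touchable face at startup.
    public class SingleFaceTouchSetup : MonoBehaviour
    {
        private void Start()
        {
            // Step 1: the touchable plane is derived from a BoxCollider.
            gameObject.AddComponent<BoxCollider>();
            var touchable = gameObject.AddComponent<NearInteractionTouchable>();

            // Step 1.1: receive Touch events for use with IMixedRealityTouchHandler.
            touchable.EventsToReceive = TouchableEventType.Touch;

            // Step 2: add a handler script, e.g. the TouchEventsExample shown below.
            gameObject.AddComponent<TouchEventsExample>();
        }
    }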

    Arbitrary collider touch

    This is useful in situations where the GameObject needs to be touchable across its entire collider. For example, this can be used to enable touch interactions for an object with a SphereCollider, where the entire collider needs to be touchable.

    1. On the GameObject that should be touchable, add a collider and a NearInteractionTouchableVolume component.

      1. Set Events to Receive to Touch if using the IMixedRealityTouchHandler interface in your component script below.
    2. On that object or one of its ancestors, add a script component that implements the IMixedRealityTouchHandler interface. Any ancestor of the object with the NearInteractionTouchableVolume will be able to receive pointer events, as well.
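
    A complete code example that configures a NearInteractionTouchableVolume this way appears in the Touch events section near the end of this article.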

    Unity UI

    1. Ensure there is a UnityUI canvas in the scene, adding one if necessary.

    2. On the GameObject that should be touchable, add a NearInteractionTouchableUnityUI component.

      1. Set Events to Receive to Touch if using the IMixedRealityTouchHandler interface in your component script below.
    3. On that object or one of its ancestors, add a script component that implements the IMixedRealityTouchHandler interface. Any ancestor of the object with the NearInteractionTouchableUnityUI will be able to receive pointer events as well. (A code sketch follows the note below.)

    Important

    On the NearInteractionTouchable script components, the Events to Receive property has two options: Pointer and Touch. Set Events to Receive to Pointer if using the IMixedRealityPointerHandler interface, and to Touch if using the IMixedRealityTouchHandler interface, in the component script that handles the input events.
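
    As a minimal illustration of the Unity UI steps above, the sketch below makes a UI element poke-touchable at startup. The class name and the serialized field are hypothetical, and TouchEventsExample is the handler defined in the next section:

    using Microsoft.MixedReality.Toolkit.Input;
    using UnityEngine;

    // Hypothetical example component: makes a UnityUI element poke-touchable.
    public class UnityUITouchSetup : MonoBehaviour
    {
        // Assumed to reference a GameObject under a UnityUI canvas, assigned in the Inspector.
        [SerializeField] private GameObject uiElement = null;

        private void Start()
        {
            var touchable = uiElement.AddComponent<NearInteractionTouchableUnityUI>();

            // Touch pairs with IMixedRealityTouchHandler; Pointer pairs with IMixedRealityPointerHandler.
            touchable.EventsToReceive = TouchableEventType.Touch;
            uiElement.AddComponent<TouchEventsExample>();
        }
    }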

    Touch code example

    The code below demonstrates a MonoBehaviour that can be attached to a GameObject that has a NearInteractionTouchable variant component, in order to respond to touch input events.

    using Microsoft.MixedReality.Toolkit.Input;
    using UnityEngine;

    public class TouchEventsExample : MonoBehaviour, IMixedRealityTouchHandler
    {
        public void OnTouchStarted(HandTrackingInputEventData eventData)
        {
            string ptrName = eventData.Pointer.PointerName;
            Debug.Log($"Touch started from {ptrName}");
        }

        public void OnTouchCompleted(HandTrackingInputEventData eventData) { }
        public void OnTouchUpdated(HandTrackingInputEventData eventData) { }
    }
    

    Near interaction script examples

    Touch events

    This example makes a GameObject touchable and changes its color on touch; a hypothetical usage sketch that creates a cube and passes it in follows the code.

    public static void MakeChangeColorOnTouch(GameObject target)
    {
        // Add and configure the touchable. EventsToReceive is set to Pointer
        // because the PointerHandler below listens for pointer events.
        var touchable = target.AddComponent<NearInteractionTouchableVolume>();
        touchable.EventsToReceive = TouchableEventType.Pointer;

        var material = target.GetComponent<Renderer>().material;

        // Change color on pointer down and up.
        var pointerHandler = target.AddComponent<PointerHandler>();
        pointerHandler.OnPointerDown.AddListener((e) => material.color = Color.green);
        pointerHandler.OnPointerUp.AddListener((e) => material.color = Color.magenta);
    }
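
    For context, a hypothetical call site, for example from a Start() method in an MRTK-configured scene:

    // Hypothetical usage: create a small cube in front of the user and wire it up.
    // CreatePrimitive adds the BoxCollider that the touchable volume needs.
    var cube = GameObject.CreatePrimitive(PrimitiveType.Cube);
    cube.transform.position = new Vector3(0f, 0f, 1f);
    cube.transform.localScale = Vector3.one * 0.2f;
    MakeChangeColorOnTouch(cube);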
    

    Grab events

    The example below shows how to make a GameObject draggable. It assumes that the GameObject has a collider on it.

    public static void MakeNearDraggable(GameObject target)
    {
        // Instantiate and add the grabbable
        target.AddComponent<NearInteractionGrabbable>();

        // Add the ability to drag by re-parenting to the pointer object on pointer down
        var pointerHandler = target.AddComponent<PointerHandler>();
        pointerHandler.OnPointerDown.AddListener((e) =>
        {
            if (e.Pointer is SpherePointer spherePointer)
            {
                // While grabbed, the object follows the grab pointer's transform.
                target.transform.parent = spherePointer.transform;
            }
        });
        pointerHandler.OnPointerUp.AddListener((e) =>
        {
            if (e.Pointer is SpherePointer)
            {
                // On release, detach so the object stays where it was dropped.
                target.transform.parent = null;
            }
        });
    }
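
    Re-parenting the target to the pointer makes the object rigidly follow the hand while grabbed; for smoothing, constraints, and two-handed manipulation, MRTK's Object Manipulator component covers this scenario without custom code. For context, a hypothetical call site:

    // Hypothetical usage: create a small sphere (CreatePrimitive adds a SphereCollider)
    // and make it draggable.
    var sphere = GameObject.CreatePrimitive(PrimitiveType.Sphere);
    sphere.transform.position = new Vector3(0f, 0f, 1f);
    sphere.transform.localScale = Vector3.one * 0.2f;
    MakeNearDraggable(sphere);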
    

    Useful APIs

    • NearInteractionGrabbable
    • NearInteractionTouchable
    • NearInteractionTouchableUnityUI
    • NearInteractionTouchableVolume
    • IMixedRealityTouchHandler
    • IMixedRealityPointerHandler

    See also

    • Input Overview
    • Pointers
    • Input Events