
We've moved!

Starting from MRTK 2.6, we are publishing both conceptual docs and API references on docs.microsoft.com. For conceptual docs, please visit our new landing page. For API references, please visit the MRTK-Unity section of the dot net API explorer. Existing content will remain here but will not be updated further.


    Input simulation service

    The Input Simulation Service emulates the behavior of devices and platforms that may not be available in the Unity editor. Examples include:

    • HoloLens or VR device head tracking
    • HoloLens hand gestures
    • HoloLens 2 articulated hand tracking
    • HoloLens 2 eye tracking
    • VR device controllers

    Simulated devices are controlled with a conventional keyboard and mouse at runtime, which allows interactions to be tested in the Unity editor without first deploying to a device.

    Warning

    This does not work when using Unity's XR Holographic Emulation > Emulation Mode = "Simulate in Editor": Unity's in-editor simulation takes control away from MRTK's input simulation. To use the MRTK input simulation service, set XR Holographic Emulation to Emulation Mode = "None".

    Enabling the input simulation service

    Input simulation is enabled by default in the profiles that ship with MRTK.

    However, input simulation is an optional Mixed Reality service and can be removed as a data provider in the Input System profile.

    Under the Input System's data provider configuration, the Input Simulation service can be configured with the following settings:

    • Type must be Microsoft.MixedReality.Toolkit.Input > InputSimulationService.
    • Supported Platform(s) by default includes all Editor platforms, since the service uses keyboard and mouse input.
    Note

    The Input Simulation service can be used on other platforms, such as standalone, by changing the Supported Platform(s) property to include the desired targets.
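
    In addition to profile configuration, the running service instance can be retrieved from code, for example to inspect or drive simulation state from test scripts. A minimal sketch, assuming MRTK 2.x's CoreServices and data provider access APIs (the class name InputSimulationAccessExample is illustrative):

```csharp
using Microsoft.MixedReality.Toolkit;
using Microsoft.MixedReality.Toolkit.Input;
using UnityEngine;

public class InputSimulationAccessExample : MonoBehaviour
{
    private void Start()
    {
        // The input simulation service is registered as a data provider of the
        // input system, so it is resolved through the provider access interface.
        if (CoreServices.InputSystem is IMixedRealityDataProviderAccess providerAccess)
        {
            var simulationService = providerAccess.GetDataProvider<IInputSimulationService>();
            Debug.Log(simulationService != null
                ? "Input simulation service is running."
                : "Input simulation service is not registered.");
        }
    }
}
```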

    Input simulation tools window

    Enable the input simulation tools window from the Mixed Reality Toolkit > Utilities > Input Simulation menu. This window provides access to the state of input simulation during play mode.

    Viewport buttons

    A prefab for in-editor buttons to control basic hand placement can be specified in the input simulation profile under Indicators Prefab. This is an optional utility; the same features can be accessed in the input simulation tools window.

    Note

    The viewport indicators are disabled by default, as they currently can sometimes interfere with Unity UI interactions. See issue #6106. To enable, add the InputSimulationIndicators prefab to Indicators Prefab.

    Hand icons show the state of the simulated hands:

    • Untracked hand icon: The hand is not tracked. Click to enable the hand.
    • Tracked hand icon: The hand is tracked, but not controlled by the user. Click to hide the hand.
    • Controlled hand icon: The hand is tracked and controlled by the user. Click to hide the hand.
    • Reset hand icon: Click to reset the hand to its default position.

    In-editor input simulation cheat sheet

    Press Left Ctrl + H in the HandInteractionExamples scene to bring up a cheat sheet of the input simulation controls.

    Camera control

    Head movement can be emulated by the Input Simulation Service.

    Rotating the camera

    1. Hover over the viewport editor window. You may need to click the window to give it input focus if button presses don't work.
    2. Press and hold the Mouse Look Button (default: Right mouse button).
    3. Move the mouse in the viewport window to rotate the camera.
    4. Use the scroll wheel to roll the camera around the view direction.

    Camera rotation speed can be configured by changing the Mouse Look Speed setting in the input simulation profile.

    Alternatively, use the Look Horizontal/Look Vertical axes to rotate the camera (default: game controller right thumbstick).

    Moving the camera

    Use the Move Horizontal/Move Vertical axes to move the camera (default: WASD keys or game controller left thumbstick).

    Camera position and rotation angles can be set explicitly in the tools window, as well. The camera can be reset to its default using the Reset button.
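
    Because the simulated head pose drives the main camera transform, camera movement can be verified from code the same way as on device. A small sketch using the MRTK CameraCache utility (Camera.main works equally well); the per-frame logging is for illustration only:

```csharp
using Microsoft.MixedReality.Toolkit.Utilities;
using UnityEngine;

public class HeadPoseLogger : MonoBehaviour
{
    private void Update()
    {
        // In editor play mode this transform is driven by the input simulation
        // service; on device it is driven by the real head tracker.
        Transform head = CameraCache.Main.transform;
        Debug.Log($"Head position: {head.position}, rotation: {head.rotation.eulerAngles}");
    }
}
```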

    Controller simulation

    The input simulation supports emulated controller devices (i.e., motion controllers and hands). These virtual controllers can interact with any object that supports regular controllers, such as buttons or grabbable objects.

    Controller simulation mode

    In the input simulation tools window the Default Controller Simulation Mode setting switches between three distinct input models. This default mode can also be set in the input simulation profile.

    • Articulated Hands: Simulates a fully articulated hand device with joint position data.

      Emulates HoloLens 2 interaction model.

      Interactions that rely on the precise position of the hand, or on touching objects, can be simulated in this mode.

    • Hand Gestures: Simulates a simplified hand model with air tap and basic gestures.

      Emulates HoloLens interaction model.

      Focus is controlled using the Gaze pointer. The Air Tap gesture is used to interact with buttons.

    • Motion Controller: Simulates a motion controller used with VR headsets; it works similarly to far interaction with Articulated Hands.

      Emulates the interaction model of a VR headset with controllers.

      The trigger, grab, and menu keys are simulated via keyboard and mouse input.
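
    The active mode can also be switched from code during play mode. A sketch, assuming the ControllerSimulationMode property introduced with the MRTK 2.5 input simulation service; the enum member names below mirror the profile labels and should be verified against your MRTK version:

```csharp
using Microsoft.MixedReality.Toolkit;
using Microsoft.MixedReality.Toolkit.Input;
using UnityEngine;

public class SimulationModeSwitcher : MonoBehaviour
{
    private void Update()
    {
        // Illustrative key binding: F2 switches to motion controller simulation.
        if (Input.GetKeyDown(KeyCode.F2) &&
            CoreServices.InputSystem is IMixedRealityDataProviderAccess providerAccess)
        {
            var simulation = providerAccess.GetDataProvider<IInputSimulationService>();
            if (simulation != null)
            {
                // Enum member assumed to mirror the "Motion Controller" profile label.
                simulation.ControllerSimulationMode = ControllerSimulationMode.MotionController;
            }
        }
    }
}
```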

    Simulating controller movement

    Press and hold the Left/Right Controller Manipulation Key (default: Left Shift for left controller and Space for right controller) to gain control of either controller. While the manipulation key is pressed, the controller will appear in the viewport. Once the manipulation key is released, the controllers will disappear after a short Controller Hide Timeout.

    Controllers can be toggled on and frozen relative to the camera in the input simulation tools window or by pressing the Toggle Left/Right Controller Key (default: T for left and Y for right). Press the toggle key again to hide the controllers. To manipulate the controllers, the Left/Right Controller Manipulation Key must be held. Double-tapping the Left/Right Controller Manipulation Key also toggles the controllers on/off.

    Mouse movement will move the controller in the view plane. Controllers can be moved further or closer to the camera using the mouse wheel.

    To rotate controllers using the mouse, hold both the Left/Right Controller Manipulation Key (Left Shift or Space) and the Controller Rotate Button (default: Left Ctrl button) and then move the mouse to rotate the controller. Controller rotation speed can be configured by changing the Mouse Controller Rotation Speed setting in the input simulation profile.

    All hand placement can also be changed in the input simulation tools window, including resetting the hands to their defaults.

    Additional profile settings

    • Controller Depth Multiplier controls the sensitivity of the mouse scroll wheel depth movement. A larger number will speed up controller zoom.
    • Default Controller Distance is the initial distance of controllers from the camera. Clicking the Reset button will also place controllers at this distance.
    • Controller Jitter Amount adds random motion to controllers. This feature can be used to simulate inaccurate controller tracking on the device, and ensure that interactions work well with noisy input.

    Hand gestures

    Hand gestures such as pinching, grabbing, poking, etc. can also be simulated.

    1. Enable hand control using the Left/Right Controller Manipulation Key (Left Shift or Space).

    2. While manipulating, press and hold a mouse button to perform a hand gesture.

    Each of the mouse buttons can be mapped to transform the hand shape into a different gesture using the Left/Middle/Right Mouse Hand Gesture settings. The Default Hand Gesture is the shape of the hand when no button is pressed.

    Note

    The Pinch gesture is the only gesture that performs the "Select" action at this point.
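
    Since articulated hand simulation publishes full joint data, the simulated hands can be queried from code exactly like real ones, which makes it easy to verify gesture logic in play mode. A minimal sketch using MRTK's hand joint utilities; the per-frame log is for illustration only:

```csharp
using Microsoft.MixedReality.Toolkit.Input;
using Microsoft.MixedReality.Toolkit.Utilities;
using UnityEngine;

public class SimulatedHandProbe : MonoBehaviour
{
    private void Update()
    {
        // Works identically for simulated hands in the editor and real hands on device.
        if (HandJointUtils.TryGetJointPose(TrackedHandJoint.IndexTip,
                                           Handedness.Right,
                                           out MixedRealityPose pose))
        {
            Debug.Log($"Right index tip at {pose.Position}");
        }
    }
}
```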

    One-hand manipulation

    1. Press and hold Left/Right Controller Manipulation Key (Left Shift or Space)
    2. Point at object
    3. Hold mouse button to pinch
    4. Use your mouse to move the object
    5. Release the mouse button to stop interaction
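
    These steps assume an object that responds to pinch. A minimal setup sketch using MRTK's ObjectManipulator and NearInteractionGrabbable components (CreatePrimitive adds the collider required for grabbing; the factory class name is illustrative):

```csharp
using Microsoft.MixedReality.Toolkit.Input;
using Microsoft.MixedReality.Toolkit.UI;
using UnityEngine;

public static class ManipulableCubeFactory
{
    // Creates a cube that can be moved with simulated (or real) hands.
    public static GameObject Create()
    {
        GameObject cube = GameObject.CreatePrimitive(PrimitiveType.Cube);
        cube.transform.position = new Vector3(0f, 0f, 1f); // one meter ahead of origin
        cube.transform.localScale = Vector3.one * 0.2f;

        cube.AddComponent<ObjectManipulator>();        // far and near manipulation
        cube.AddComponent<NearInteractionGrabbable>(); // enables near grab/pinch
        return cube;
    }
}
```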

    Two-hand manipulation

    For manipulating objects with two hands at the same time, the persistent hand mode is recommended.

    1. Toggle on both hands by pressing the toggle keys (T/Y).
    2. Manipulate one hand at a time:
      1. Hold Space to control the right hand.
      2. Move the hand to where you want to grab the object.
      3. Press the left mouse button to activate the Pinch gesture.
      4. Release Space to stop controlling the right hand. The hand will be frozen in place and be locked into the Pinch gesture since it is no longer being manipulated.
    3. Repeat the process with the other hand, grabbing the same object in a second spot.
    4. Now that both hands are grabbing the same object, you can move either of them to perform two-handed manipulation.

    GGV (Gaze, Gesture, and Voice) interaction

    By default, GGV interaction is enabled in-editor while there are no articulated hands present in the scene.

    1. Rotate the camera to point the gaze cursor at the interactable object (right mouse button)
    2. Click and hold left mouse button to interact
    3. Rotate the camera again to manipulate the object

    You can turn this off by toggling the Is Hand Free Input Enabled option inside the Input Simulation Profile.

    In addition, you can use simulated hands for GGV interaction:

    1. Enable GGV simulation by switching the Default Controller Simulation Mode to Hand Gestures in the Input Simulation Profile
    2. Rotate the camera to point the gaze cursor at the interactable object (right mouse button)
    3. Hold Space to control the right hand
    4. Click and hold left mouse button to interact
    5. Use your mouse to move the object
    6. Release the mouse button to stop interaction
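
    Whether the select comes from a simulated air tap, a simulated pinch, or a real device, it is delivered through the same pointer events. A sketch of a handler that reacts to the simulated input above; attach it to an object with a collider so it can receive focus:

```csharp
using Microsoft.MixedReality.Toolkit.Input;
using UnityEngine;

public class SelectLogger : MonoBehaviour, IMixedRealityPointerHandler
{
    public void OnPointerDown(MixedRealityPointerEventData eventData) =>
        Debug.Log($"Pointer down from {eventData.Pointer.PointerName}");

    public void OnPointerDragged(MixedRealityPointerEventData eventData) { }

    public void OnPointerUp(MixedRealityPointerEventData eventData) =>
        Debug.Log("Pointer up");

    public void OnPointerClicked(MixedRealityPointerEventData eventData) =>
        Debug.Log("Pointer clicked");
}
```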

    Motion controller interaction

    The simulated motion controllers can be manipulated the same way articulated hands are. The interaction model is similar to the far interaction of articulated hands, with the trigger, grab, and menu keys mapped to the left mouse button and the G and M keys, respectively.

    Eye tracking

    Eye tracking simulation can be enabled by checking the Simulate Eye Position option in the Input Simulation Profile. This should not be used with GGV or motion controller style interactions (so ensure that the Default Controller Simulation Mode is set to Articulated Hands).
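
    Simulated eye gaze is surfaced through the same provider as real eye tracking, so gaze-dependent code can be exercised in the editor. A sketch, assuming the eye gaze provider API described in the eye tracking documentation:

```csharp
using Microsoft.MixedReality.Toolkit;
using UnityEngine;

public class EyeGazeProbe : MonoBehaviour
{
    private void Update()
    {
        var eyeGaze = CoreServices.InputSystem?.EyeGazeProvider;
        if (eyeGaze != null && eyeGaze.IsEyeTrackingEnabledAndValid)
        {
            // With Simulate Eye Position enabled, this ray follows the camera.
            Debug.Log($"Eye gaze origin {eyeGaze.GazeOrigin}, direction {eyeGaze.GazeDirection}");
        }
    }
}
```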

    See also

    • Input System profile.