    Input Simulation Service

    The Input Simulation Service emulates the behaviour of devices and platforms that may not be available in the Unity editor. Examples include:

    • HoloLens or VR device head tracking
    • HoloLens hand gestures
    • HoloLens 2 articulated hand tracking
    • HoloLens 2 eye tracking

    Simulated devices are controlled at runtime with a conventional keyboard and mouse, so interactions can be tested in the Unity editor without first deploying to a device.
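
    Because the simulation runs entirely in the editor, simulated input can also be driven from code, for example in play-mode tests. The following is a minimal sketch assuming MRTK's test utilities (the TestHand helper from the Microsoft.MixedReality.Toolkit.Tests assembly) are available in your project; exact names and signatures may differ between MRTK versions.

        using System.Collections;
        using Microsoft.MixedReality.Toolkit.Input;
        using Microsoft.MixedReality.Toolkit.Tests;      // assumed: MRTK play-mode test utilities
        using Microsoft.MixedReality.Toolkit.Utilities;
        using UnityEngine;
        using UnityEngine.TestTools;

        public class SimulatedHandTests
        {
            [UnityTest]
            public IEnumerator PinchAndDrag()
            {
                // Requires a scene with MRTK configured.
                var hand = new TestHand(Handedness.Right);
                yield return hand.Show(new Vector3(0f, 0f, 0.5f));                  // show the hand 0.5 m ahead
                yield return hand.SetGesture(ArticulatedHandPose.GestureId.Pinch);  // perform Select
                yield return hand.MoveTo(new Vector3(0.1f, 0f, 0.5f));              // drag while pinching
                yield return hand.SetGesture(ArticulatedHandPose.GestureId.Open);   // release
            }
        }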

    Warning

    This does not work when using Unity's XR Holographic Emulation > Emulation Mode = "Simulate in Editor". Unity's in-editor simulation takes control away from MRTK's input simulation. To use the MRTK input simulation service, set XR Holographic Emulation to Emulation Mode = "None".

    Enabling the Input Simulation Service

    Input simulation is enabled by default in MRTK.

    Input simulation is an optional Mixed Reality service. It can be added as a data provider in the Input System profile.

    • Type must be Microsoft.MixedReality.Toolkit.Input > InputSimulationService.
    • Platform(s) should always be Windows Editor since the service depends on keyboard and mouse input.
    • Profile has all settings for input simulation.
    Warning

    At the time of this writing, a profile of any type can be assigned to a service. If you assign a different profile to the service, make sure it is a profile of type Input Simulation, or the service will not work!

    Open the linked profile to access settings for input simulation.
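
    The simulation can also be inspected or tweaked from script. Below is a minimal sketch for fetching the service at runtime using the MRTK 2.x data provider access pattern; the InputSimulationProfile property name is an assumption based on MRTK 2.x and may differ in other versions.

        using Microsoft.MixedReality.Toolkit;
        using Microsoft.MixedReality.Toolkit.Input;
        using UnityEngine;

        public class InputSimulationAccessExample : MonoBehaviour
        {
            void Start()
            {
                // The input simulation service is registered as an input system data provider.
                var simulation = (CoreServices.InputSystem as IMixedRealityDataProviderAccess)?
                    .GetDataProvider<InputSimulationService>();

                if (simulation == null)
                {
                    Debug.LogWarning("Input simulation service not found (editor only).");
                    return;
                }

                // The linked profile holds all of the settings described below.
                Debug.Log($"Simulation profile: {simulation.InputSimulationProfile}");
            }
        }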

    Camera Control

    Head movement can be emulated by the Input Simulation Service.

    Rotating the camera

    1. Hover over the viewport editor window.

      You may need to click the window to give it input focus if button presses don't work.

    2. Press and hold the Mouse Look Button (default: Right mouse button).

    3. Move the mouse in the viewport window to rotate the camera.

    Moving the camera

    Press and hold the movement keys (W/A/S/D for forward/left/back/right).
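
    To verify that simulated head movement behaves as expected, you can log the main camera's pose, which is the transform the simulation drives. A minimal sketch using MRTK's CameraCache helper:

        using Microsoft.MixedReality.Toolkit.Utilities;
        using UnityEngine;

        public class HeadPoseLogger : MonoBehaviour
        {
            void Update()
            {
                // Input simulation moves the main camera, so its transform is the simulated head pose.
                Transform head = CameraCache.Main.transform;
                Debug.Log($"Head position: {head.position}, rotation: {head.rotation.eulerAngles}");
            }
        }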

    Hand Simulation

    The input simulation supports emulated hand devices. These virtual hands can interact with any object that supports regular hand devices, such as buttons or grabbable objects.

    The Hand Simulation Mode switches between two distinct input models.

    • Articulated Hands: Simulates a fully articulated hand device with joint position data.

      Emulates the HoloLens 2 interaction model.

      Interactions that rely on precise hand positioning or on touch can be simulated in this mode.

    • Gestures: Simulates a simplified hand model with air tap and basic gestures.

      Emulates the HoloLens (first generation) interaction model.

      Focus is controlled using the Gaze pointer. The Air Tap gesture is used to interact with buttons.
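
    The mode can also be switched from script, for example to flip between HoloLens 2 and HoloLens (first generation) style testing without editing the profile. A sketch assuming the service exposes a settable HandSimulationMode property, as in MRTK 2.x (the property and enum names may differ in other versions):

        using Microsoft.MixedReality.Toolkit;
        using Microsoft.MixedReality.Toolkit.Input;
        using UnityEngine;

        public class HandSimulationModeSwitcher : MonoBehaviour
        {
            void Update()
            {
                // Press G to toggle between the two input models.
                if (Input.GetKeyDown(KeyCode.G))
                {
                    var simulation = (CoreServices.InputSystem as IMixedRealityDataProviderAccess)?
                        .GetDataProvider<InputSimulationService>();
                    if (simulation != null)
                    {
                        simulation.HandSimulationMode =
                            simulation.HandSimulationMode == HandSimulationMode.Articulated
                                ? HandSimulationMode.Gestures
                                : HandSimulationMode.Articulated;
                    }
                }
            }
        }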

    Controlling hand movement

    Press and hold the Left/Right Hand Manipulation Key (default: Left Shift/Space for left/right respectively) to gain control of either hand. While the manipulation key is pressed, the hand will appear in the viewport. Mouse movement will move the hand in the view plane.

    Once the manipulation key is released, the hands disappear after a short Hand Hide Timeout. To keep the hands visible permanently, press the Toggle Left/Right Hand Key (default: T/Y for left/right respectively). Press the toggle key again to hide the hands.

    Hands can be moved closer to or further from the camera using the mouse wheel. By default the hand moves somewhat slowly in response to scrolling; increase the Hand Depth Multiplier to make it faster.

    The initial distance from the camera that the hand appears at is controlled by Default Hand Distance.

    By default, the simulated hand joints are perfectly still. On real devices there is always some jitter/noise from the underlying hand tracking; with hand mesh or joint visualization enabled on a device, you can see the joints jitter slightly even when the hand is held perfectly still. Jitter can be simulated by setting Hand Jitter Amount to a positive value (for example, 0.1).

    Hands can be rotated when a precise orientation is required.

    • Yaw rotates around the Y axis (default: E/Q keys for clockwise/counter-clockwise rotation)
    • Pitch rotates around the X axis (default: F/R keys for clockwise/counter-clockwise rotation)
    • Roll rotates around the Z axis (default: X/Z keys for clockwise/counter-clockwise rotation)
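
    Because articulated simulation publishes full joint data, scripts that query hand joints behave the same in the editor as on device. For example, logging the simulated right index fingertip with MRTK's HandJointUtils:

        using Microsoft.MixedReality.Toolkit.Input;
        using Microsoft.MixedReality.Toolkit.Utilities;
        using UnityEngine;

        public class IndexTipLogger : MonoBehaviour
        {
            void Update()
            {
                // Succeeds while the simulated right hand is visible in the viewport.
                if (HandJointUtils.TryGetJointPose(TrackedHandJoint.IndexTip, Handedness.Right, out MixedRealityPose pose))
                {
                    Debug.Log($"Right index tip at {pose.Position}");
                }
            }
        }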

    Hand Gestures

    Hand gestures such as pinching, grabbing, poking, etc. can also be simulated.

    1. First enable hand control using the manipulation keys (Left Shift/Space).

      Alternatively, toggle the hands on/off using the toggle keys (T/Y).

    2. While manipulating, press and hold a mouse button to perform a hand gesture.

    Each of the mouse buttons can be mapped to transform the hand shape into a different gesture using the Left/Middle/Right Mouse Hand Gesture settings. The Default Hand Gesture is the shape of the hand when no button is pressed.

    Note

    The Pinch gesture is the only gesture that performs the "Select" action at this point.
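
    Because the simulated pinch fires the standard Select action, it reaches scripts through the regular MRTK pointer events; no simulation-specific code is needed. For example:

        using Microsoft.MixedReality.Toolkit.Input;
        using UnityEngine;

        // Attach to a GameObject with a collider; MRTK routes pointer events to the focused object.
        public class SelectLogger : MonoBehaviour, IMixedRealityPointerHandler
        {
            public void OnPointerDown(MixedRealityPointerEventData eventData) =>
                Debug.Log("Select started (simulated pinch)");

            public void OnPointerUp(MixedRealityPointerEventData eventData) =>
                Debug.Log("Select ended (pinch released)");

            public void OnPointerClicked(MixedRealityPointerEventData eventData) { }
            public void OnPointerDragged(MixedRealityPointerEventData eventData) { }
        }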

    One-Hand Manipulation

    1. Press and hold a hand manipulation key (default: Left Shift/Space for left/right respectively).
    2. Point at the object.
    3. Hold a mouse button to pinch.
    4. Use the mouse to move the object.
    5. Release the mouse button to stop the interaction.
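
    For the simulated pinch to actually move an object, the object needs the usual MRTK manipulation components. A minimal setup sketch, using component names from MRTK 2.4+ (older versions use ManipulationHandler instead of ObjectManipulator):

        using Microsoft.MixedReality.Toolkit.Input;
        using Microsoft.MixedReality.Toolkit.UI;
        using UnityEngine;

        public class MakeGrabbable : MonoBehaviour
        {
            void Start()
            {
                // A collider is required so pointers can target the object.
                if (GetComponent<Collider>() == null)
                {
                    gameObject.AddComponent<BoxCollider>();
                }

                gameObject.AddComponent<NearInteractionGrabbable>(); // near (grab/touch) interaction
                gameObject.AddComponent<ObjectManipulator>();        // one- and two-hand manipulation
            }
        }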

    Two-Hand Manipulation

    For manipulating objects with two hands at the same time, the persistent hand mode is recommended.

    1. Toggle on both hands by pressing the toggle keys (T/Y).
    2. Manipulate one hand at a time:
      • Hold Space to control the right hand.
      • Move the hand to where you want to grab the object.
      • Press a mouse button to activate the Pinch gesture. In persistent mode the gesture remains active when you release the mouse button.
    3. Repeat the process with the other hand, grabbing the same object in a second spot.
    4. Now that both hands are grabbing the same object, move either of them to perform two-handed manipulation.

    GGV Interaction

    1. Enable GGV (Gaze, Gesture, and Voice) simulation by switching Hand Simulation Mode to Gestures in the Input Simulation Profile.
    2. Rotate the camera to point the gaze cursor at the interactable object (right mouse button).
    3. Hold Space to control the right hand.
    4. Click and hold the left mouse button to interact.
    5. Rotate the camera again to manipulate the object.
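
    In Gestures mode, focus comes from head gaze, which scripts can query through the input system's gaze provider. A minimal sketch, assuming the MRTK 2.x GazeProvider API:

        using Microsoft.MixedReality.Toolkit;
        using UnityEngine;

        public class GazeTargetLogger : MonoBehaviour
        {
            void Update()
            {
                // GazeTarget is the object currently under the (simulated) head-gaze cursor.
                GameObject target = CoreServices.InputSystem?.GazeProvider?.GazeTarget;
                if (target != null)
                {
                    Debug.Log($"Gazing at {target.name}");
                }
            }
        }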

    Eye tracking

    Eye tracking simulation can be enabled by checking the Simulate Eye Position option in the Input Simulation Profile. This should not be used with GGV-style interactions (so ensure that Hand Simulation Mode is set to Articulated).
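
    Simulated eye gaze is surfaced through the same eye gaze provider as on device, so eye-tracking code can be tested unchanged. A sketch, assuming the MRTK 2.x IMixedRealityEyeGazeProvider properties:

        using Microsoft.MixedReality.Toolkit;
        using Microsoft.MixedReality.Toolkit.Input;
        using UnityEngine;

        public class EyeGazeLogger : MonoBehaviour
        {
            void Update()
            {
                IMixedRealityEyeGazeProvider eyeGaze = CoreServices.InputSystem?.EyeGazeProvider;
                if (eyeGaze != null && eyeGaze.IsEyeTrackingEnabledAndValid)
                {
                    // With Simulate Eye Position enabled, the gaze ray follows the camera.
                    Debug.Log($"Eye gaze origin: {eyeGaze.GazeOrigin}, direction: {eyeGaze.GazeDirection}");
                }
            }
        }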
