    Getting started with eye tracking in MRTK

    This page covers how to set up your Unity MRTK scene to use eye tracking in your app. The following assumes you are starting out with a fresh new scene. Alternatively, you can check out our already configured MRTK eye tracking examples, which offer plenty of samples that you can build on directly.

    Eye tracking requirements checklist

    For eye tracking to work correctly, the following requirements must be met. If you are new to eye tracking on HoloLens 2 and to how eye tracking is set up in MRTK, don't worry! We will go into detail on how to address each of them further below.

    1. An 'Eye Gaze Data Provider' must be added to the input system. This provides eye tracking data from the platform.
    2. The GazeProvider must have its 'Use Eye Tracking' property set to true. Note that true is the default value, so no special action is required unless you have actively unchecked this property.
    3. The 'GazeInput' capability must be enabled in the application manifest. Currently this is only available in Visual Studio and through the MRTK build tool.
    4. The HoloLens must be eye calibrated for the current user. Check out our sample for detecting whether a user is eye calibrated or not (a minimal code sketch for this check follows the note below).

    IMPORTANT: If any of the above requirements are not met, the application will automatically fall back to head-based gaze tracking.
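
    For example, requirement 4 can be checked at runtime through the input system's eye gaze provider. The following is a minimal sketch, assuming your MRTK version exposes the nullable IsEyeCalibrationValid flag on the EyeGazeProvider described later in this article; the component name EyeCalibrationNotifier is made up for illustration:

      using Microsoft.MixedReality.Toolkit;
      using UnityEngine;

      // Hypothetical example: warn if the current user has not run eye calibration.
      public class EyeCalibrationNotifier : MonoBehaviour
      {
          private void Start()
          {
              // IsEyeCalibrationValid is nullable: null means the status is not known yet.
              bool? isCalibrated = CoreServices.InputSystem?.EyeGazeProvider?.IsEyeCalibrationValid;
              if (isCalibrated.HasValue && !isCalibrated.Value)
              {
                  Debug.LogWarning("Eye tracking will fall back to head gaze: " +
                                   "the current user has not been eye calibrated.");
              }
          }
      }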

    A note on the GazeInput capability

    The MRTK-provided build tooling (i.e. Mixed Reality Toolkit -> Utilities -> Build Window) can automatically enable the GazeInput capability for you. In order to do this, you need to make sure that the 'Gaze Input Capability' is checked on the 'Appx Build Options' tab:

    [Screenshot: MRTK Build Tools, 'Appx Build Options' tab]

    This tooling finds the AppX manifest after the Unity build is completed and adds the GazeInput capability. Note that this tooling is NOT active when using Unity's built-in Build Window (i.e. File -> Build Settings). In that case, the capability needs to be added manually after the Unity build.

    Setting up eye tracking step-by-step

    Setting up the scene

    Set up the MixedRealityToolkit by simply clicking 'Mixed Reality Toolkit -> Configure…' in the menu bar.


    Setting up the MRTK profiles required for eye tracking

    After setting up your MRTK scene, you will be asked to choose a profile for MRTK. You can simply select DefaultMixedRealityToolkitConfigurationProfile and then select the 'Copy & Customize' option.


    Create an "Eye Gaze Data Provider"

    • Click on the 'Input' tab in your MRTK profile.

    • To edit the default input profile ('DefaultMixedRealityInputSystemProfile'), click the 'Clone' button next to it. A 'Clone Profile' menu appears; click 'Clone' at the bottom of that menu.

    • Double click on your new input profile and select '+ Add Data Provider'.

    • Create a new data provider:

      • Under Type select 'Microsoft.MixedReality.Toolkit.WindowsMixedReality.Input' -> 'WindowsMixedRealityEyeGazeDataProvider'

      • For Platform(s) select 'Windows Universal'.


    Enabling eye tracking in the GazeProvider

    On HoloLens v1, head gaze was used as the primary pointing technique. Head gaze is still available via the GazeProvider in MRTK, which is attached to your camera; you can switch to eye gaze instead by ticking the 'UseEyeTracking' checkbox as shown in the screenshot below.


    NOTE: Developers can toggle between eye tracking and head tracking in code by changing the 'UseEyeTracking' property of 'GazeProvider'.
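
    As a minimal sketch of that toggle (the component name GazeModeToggle is made up for illustration; it assumes the GazeProvider registered with the MRTK input system is the one attached to your camera):

      using Microsoft.MixedReality.Toolkit;
      using Microsoft.MixedReality.Toolkit.Input;
      using UnityEngine;

      // Hypothetical helper for switching between eye gaze and head gaze at runtime.
      public class GazeModeToggle : MonoBehaviour
      {
          public void SetEyeGazeEnabled(bool useEyeGaze)
          {
              // The input system's gaze provider is the GazeProvider component on the camera;
              // its 'UseEyeTracking' property selects between eye gaze and head gaze.
              if (CoreServices.InputSystem?.GazeProvider is GazeProvider gazeProvider)
              {
                  gazeProvider.UseEyeTracking = useEyeGaze;
              }
          }
      }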

    Simulating eye tracking in the Unity Editor

    You can simulate eye tracking input in the Unity Editor to ensure that events are correctly triggered before deploying the app to your HoloLens 2. The eye gaze signal is simulated by using the camera's location as the eye gaze origin and the camera's forward vector as the eye gaze direction. While this is great for initial testing, it is not a good imitation of rapid eye movements, so make sure to test your eye-based interactions frequently on the HoloLens 2 itself.

    1. Enable simulated eye tracking:
      • Click on the 'Input' tab in your MRTK configuration profile.
      • From there, navigate to 'Input Data Providers' -> 'Input Simulation Service'.
      • Check the 'Simulate Eye Position' checkbox.


    2. Disable the default head gaze cursor: In general, it is recommended to avoid showing an eye gaze cursor or, if one is absolutely required, to make it very subtle. We recommend hiding the default head gaze cursor that is attached to the MRTK gaze pointer profile by default.
      • Navigate to your MRTK configuration profile -> 'Input' -> 'Pointers'
      • Clone the 'DefaultMixedRealityInputPointerProfile' to make changes to it.
      • At the top of the 'Pointer Settings', you should assign an invisible cursor prefab to the 'GazeCursor'. If you downloaded the MRTK Examples folder, you can simply reference the included 'EyeGazeCursor' prefab.


    Accessing eye gaze data

    Now that your scene is set up to use eye tracking, let's take a look at how to access it in your scripts: Accessing eye tracking data via EyeGazeProvider and eye-supported target selections.
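
    As a quick preview of what that looks like, here is a minimal sketch that reads the current eye gaze ray each frame. It assumes the EyeGazeProvider exposes the IsEyeTrackingEnabledAndValid, GazeOrigin and GazeDirection properties (check the linked articles for the exact API in your MRTK version); the component name EyeGazeLogger is made up for illustration:

      using Microsoft.MixedReality.Toolkit;
      using UnityEngine;

      // Hypothetical example: log the current eye gaze ray while eye tracking is active.
      public class EyeGazeLogger : MonoBehaviour
      {
          private void Update()
          {
              var eyeGazeProvider = CoreServices.InputSystem?.EyeGazeProvider;
              if (eyeGazeProvider != null && eyeGazeProvider.IsEyeTrackingEnabledAndValid)
              {
                  // GazeOrigin and GazeDirection describe the user's current eye gaze ray.
                  Debug.Log($"Eye gaze origin: {eyeGazeProvider.GazeOrigin}, " +
                            $"direction: {eyeGazeProvider.GazeDirection}");
              }
          }
      }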

    Testing your Unity app on a HoloLens 2

    Building your app with eye tracking should be similar to how you would compile other HoloLens 2 MRTK apps. The only difference is that the 'Gaze Input' capability is unfortunately not yet supported by Unity under 'Player Settings -> Publishing Settings -> Capabilities'. To use eye tracking on your HoloLens 2 device, you need to manually edit the package manifest that is part of your built Visual Studio project.

    Follow these steps:

    1. Build your Unity project as you would normally do for HoloLens 2.
    2. Open your compiled Visual Studio project and then open the 'Package.appxmanifest' in your solution.
    3. Make sure to tick the 'GazeInput' checkbox under Capabilities.

    Please note: You only have to do this if you build into a new build folder. This means that if you had already built your Unity project and set up the appxmanifest before and now target the same folder again, the appxmanifest should stay untouched.

    [Screenshot: Enabling the 'GazeInput' capability in Visual Studio]

    You don't see a 'GazeInput' capability?

    • Check that your system meets the prerequisites for using MRTK (in particular the Windows SDK version).
    • You can also manually add the entry by opening the appxmanifest in an XML editor and adding the following:
      <Capabilities>
        <DeviceCapability Name="gazeInput" />
      </Capabilities>
    

    Have you eye calibrated?

    Finally, please don't forget to run through the eye calibration on your HoloLens 2. The eye tracking system will not return any input if the user is not calibrated. The easiest way to get to the calibration is to flip the visor up and back down. A system notification should appear welcoming you as a new user and asking you to go through the eye calibration. Alternatively, you can find the eye calibration in the system settings: Settings -> System -> Utilities -> Open Calibration.

    Do you see the eye tracking permission prompt?

    When starting the app on your HoloLens 2 for the first time, a prompt should pop up asking the user for permission to use eye tracking. If it is not showing up, then that is usually an indication that the 'GazeInput' capability was not set.

    Once the permission prompt has been shown, it will not appear automatically again. If you denied eye tracking permission, you can reset it under Settings -> Privacy -> Apps.


    This should get you started with using eye tracking in your MRTK Unity app. Don't forget to check out our MRTK eye tracking tutorials and samples, which demonstrate how to use eye tracking input and provide scripts that you can conveniently reuse in your projects.


    Back to "Eye tracking in the MixedRealityToolkit"
