Namespace Microsoft.MixedReality.Toolkit.Input
Classes
AnimatedCursor
A cursor driven by a Unity Animator: cursor state information is injected as animation parameters so the cursor animates accordingly.
AnimatedCursorContextData
AnimatedCursorData<T>
Data struct for cursor state information for the Animated Cursor, which leverages the Unity animation system. This defines a modification to a Unity animation parameter, based on cursor state.
AnimatedCursorStateData
ArticulatedHandDefinition
Defines the interactions and data that an articulated hand can provide.
BaseController
Base Controller class to inherit from for all controllers.
BaseControllerPointer
Base Pointer class for pointers that exist in the scene as GameObjects.
BaseCursor
Object that represents a cursor in 3D space.
BaseEyeFocusHandler
Base Component for handling Eye Focus on GameObjects.
BaseFocusHandler
Base Component for handling Focus on GameObjects.
BaseGenericInputSource
Base class for input sources that don't inherit from MonoBehaviour.
BaseHand
BaseHandVisualizer
BaseInputDeviceManager
Class providing a base implementation of the IMixedRealityInputDeviceManager interface.
BaseInputEventData
Base class of all input events.
BaseInputHandler
Base class for the Mixed Reality Toolkit's SDK input handlers.
BaseInputSimulationService
Base class for services that create simulated input devices.
BaseInputSourceDefinition
Defines the base interactions and data that a controller can provide.
BaseMousePointer
Base Mouse Pointer Implementation.
BaseNearInteractionTouchable
Base class for all NearInteractionTouchables.
ColliderNearInteractionTouchable
Obsolete base class for all touchables using colliders. Use BaseNearInteractionTouchable instead.
ControllerMappingLibrary
Helper utility that manages the axis configuration required by each platform, where needed.
ControllerPoseSynchronizer
Waits for a controller to be initialized, then synchronizes its transform position to a specified handedness.
CursorContextInfo
The cursor displays the context specified in this component when the component is part of the targeted object.
CursorModifier
Component that can be added to any GameObject with a Collider to modify how the IMixedRealityCursor reacts when focused by an IMixedRealityPointer.
CurvePointer
Extends the line pointer to support curves. Useful for teleportation or other situations where multiple ray steps need to be tested along a spline.
DefaultPointerMediator
The default implementation for pointer mediation in MRTK which is responsible for determining which pointers are active based on the state of all pointers. For example, one of the key things this class does is disable far pointers when a near pointer is close to an object.
DefaultPrimaryPointerSelector
Default primary pointer selector. The primary pointer is chosen among all interaction-enabled pointers using the following rules, in order:
- Currently pressed pointer that has been pressed for the longest
- Pointer that was released most recently
- Pointer that became interaction enabled most recently
DefaultRaycastProvider
The default implementation of IMixedRealityRaycastProvider.
DictationEventData
Describes an Input Event with voice dictation.
DictationHandler
Script used to start and stop recording sessions in the current dictation system and report the transcribed text via UnityEvents. For this script to work, a dictation system like 'Windows Dictation Input Provider' must be added to the Data Providers in the Input System profile.
DictationHandler.StringUnityEvent
EyeTrackingTarget
A game object with the "EyeTrackingTarget" script attached reacts to being looked at independent of other available inputs.
FingerCursor
Cursor used to aid in near finger interactions.
FocusEventData
Describes an Input Event associated with a specific pointer's focus state change.
FocusHandler
Utility component to hook up Unity events to the OnFocusEnter and OnFocusExit events.
FocusProvider
The focus provider handles the focused objects per input source.
GazePointerVisibilityStateMachine
Helper class for managing the visibility of the gaze pointer to match Windows Mixed Reality and HoloLens 2 behavior: the gaze pointer is visible when the application starts, it is hidden when articulated hands or motion controllers appear, and it reappears whenever the user says "select".
GazeProvider
This class provides Gaze as an Input Source so users can interact with objects using their head.
GenericOpenVRControllerDefinition
GenericPointer
Base Class for pointers that don't inherit from MonoBehaviour.
GGVPointer
This class allows for HoloLens 1 style input: a far gaze ray is used for focus, with hand- and gesture-based input and interaction performed along it.
HandBounds
Utility behavior to access the axis aligned bounds of IMixedRealityHands (or the proxy visualizer of IMixedRealityControllers).
HandJointService
HandJointUtils
HandMeshInfo
Stores pointers and transform information for Hand Mesh data provided by current platform. This is the data container for the IMixedRealityHandMeshHandler input system event interface.
HandRay
HandTrackingInputEventData
HPMotionControllerDefinition
InputActionHandler
Script used to handle input action events. Invokes Unity events when the configured input action starts or ends.
InputActionUnityEvent
Unity event for input action events. Contains the data of the input event that triggered the action.
InputAnimation
Contains a set of animation curves that describe motion of camera and hands.
InputAnimationMarker
A user-defined marker on the input animation timeline.
InputAnimationSerializationUtils
Functions for serializing input animation data to and from binary files.
InputEventData
Describes an Input Event that has a source id.
InputEventData<T>
Describes an input event with data of a specific type.
InputPlaybackService
Plays back input animation via the input simulation system.
InputRayUtils
Utilities for accessing the position and rotation of input rays.
InputRecordingBuffer
Container used to efficiently store a sequence of input animation keyframes while recording.
InputRecordingBuffer.Keyframe
The input state for a single frame.
InputRecordingService
Provides input recording into an internal buffer and exporting to files.
InputSimulationIndicators
A row of indicator buttons to control input simulation features.
InputSimulationService
Service that provides simulated mixed reality input information based on mouse and keyboard input in the editor.
InputSimulationWindow
Tools for simulating and recording input as well as playing back input animation in the Unity editor.
InputSystemGlobalHandlerListener
This component ensures that input events are forwarded to it when focus or gaze is not required.
InputSystemGlobalListener
This component ensures that all input events are forwarded to this GameObject when focus or gaze is not required.
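Where a dedicated listener component is not used, a handler can register itself with the input system directly so it receives events without being focused. The sketch below assumes MRTK 2.x conventions (the CoreServices accessor and IMixedRealityInputHandler); the class name is hypothetical.

```csharp
using Microsoft.MixedReality.Toolkit;
using Microsoft.MixedReality.Toolkit.Input;
using UnityEngine;

// Minimal sketch of a globally registered input handler; class name is hypothetical.
public class GlobalInputLogger : MonoBehaviour, IMixedRealityInputHandler
{
    private void OnEnable()
    {
        // Receive input events even when this GameObject is not focused.
        CoreServices.InputSystem?.RegisterHandler<IMixedRealityInputHandler>(this);
    }

    private void OnDisable()
    {
        CoreServices.InputSystem?.UnregisterHandler<IMixedRealityInputHandler>(this);
    }

    public void OnInputDown(InputEventData eventData)
    {
        Debug.Log($"Input down: {eventData.MixedRealityInputAction.Description}");
    }

    public void OnInputUp(InputEventData eventData)
    {
        Debug.Log($"Input up: {eventData.MixedRealityInputAction.Description}");
    }
}
```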
InteractiveMeshCursor
A cursor that looks and acts like the shell cursor: a two-part cursor with visual feedback for all cursor states.
KeyBindingInspector
Inspector for KeyBindings. This shows a simple dropdown list for selecting a binding, as well as a button for binding keys by pressing them.
KeyBindingPopupWindow
Utility window that listens to input events to set a key binding. Pressing a key or mouse button will define the binding and then immediately close the popup.
KeyInputSystem
Utility class to poll input for key bindings and to simulate key presses. Mechanisms to poll and simulate input axes still need to be added: https://github.com/microsoft/MixedRealityToolkit-Unity/issues/7659
LinePointer
A simple line pointer for drawing lines from the input source origin to the current pointer position.
ManualCameraControl
Class for manually controlling the camera in the Unity editor. Used by the Input Simulation Service.
MeshCursor
Object that represents a cursor in 3D space controlled by gaze.
MixedRealityCanvasInspector
Helper class to get CanvasUtility onto Canvas objects.
MixedRealityControllerAttribute
Attach to a controller device class to make it show up in the controller mapping profile.
MixedRealityControllerInfo
This script keeps track of the GameObject representations for each button on the Mixed Reality Controllers. It also keeps track of the animation Transforms in order to properly animate according to user input.
MixedRealityControllerMappingProfile
New controller types can be registered by adding the MixedRealityControllerAttribute to the controller class.
MixedRealityControllerVisualizationProfile
Profile that determines relevant overrides and properties for controller visualization.
MixedRealityControllerVisualizer
The Mixed Reality Visualization component is primarily responsible for synchronizing the user's current input with controller models.
MixedRealityEyeTrackingProfile
MixedRealityGesturesProfile
Configuration profile settings for setting up gestures and mapping them to Input Actions.
MixedRealityHandTrackingProfile
MixedRealityInputActionMapping
Maps the capabilities of controllers, defining the physical inputs of a controller.
MixedRealityInputActionRulesProfile
MixedRealityInputActionsProfile
Configuration profile settings for setting up and consuming Input Actions.
MixedRealityInputModule
MixedRealityInputModule.PointerData
MixedRealityInputModuleEditor
MixedRealityInputRecordingProfile
Settings for recording input animation assets.
MixedRealityInputSimulationProfile
MixedRealityInputSimulationProfileInspector
MixedRealityInputSystem
The Mixed Reality Toolkit's specific implementation of the IMixedRealityInputSystem.
MixedRealityInputSystemProfile
Configuration profile settings for setting up the Mixed Reality Toolkit input system.
MixedRealityInteractionMapping
Maps the capabilities of controllers, linking the physical inputs of a controller to a logical construct in a runtime project.
MixedRealityMouseInputProfile
MixedRealityMouseInputProfileInspector
MixedRealityPointerEventData
Describes an Input Event that involves a tap, click, or touch.
MixedRealityPointerProfile
Configuration profile settings for setting up controller pointers.
MixedRealitySpeechCommandsProfile
Configuration profile settings for setting up and consuming Speech Commands.
MouseControllerDefinition
Defines the base interactions and data that a controller can provide.
MouseDelta
Utility struct that provides mouse delta in pixels (screen space), normalized viewport coordinates, and world units.
MousePointer
The MousePointer represents a mouse cursor in world space. It uses spherical movement around the camera. Its movement is bound to screen space but based on the delta movement of the computer mouse.
MouseRotationProvider
Utility class to manage toggling of mouse rotation and associated features, such as cursor visibility and locking.
NearInteractionGrabbable
Add a NearInteractionGrabbable component to any GameObject that has a collider on it in order to make that object near grabbable.
Any IMixedRealityNearPointer will then dispatch pointer events to the closest near-grabbable object.
Additionally, the near pointer will send focus enter and exit events when the decorated object is the closest object to the near pointer.
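As a rough illustration, assuming a GameObject created at runtime, making it near grabbable only requires a collider plus this component (the spawner class below is hypothetical):

```csharp
using Microsoft.MixedReality.Toolkit.Input;
using UnityEngine;

// Hypothetical setup script: makes a runtime-created cube graspable by near pointers.
public class GrabbableCubeSpawner : MonoBehaviour
{
    private void Start()
    {
        GameObject cube = GameObject.CreatePrimitive(PrimitiveType.Cube);
        cube.transform.localScale = Vector3.one * 0.1f;

        // The primitive already has a BoxCollider; NearInteractionGrabbable lets
        // near pointers dispatch pointer and focus events to it.
        cube.AddComponent<NearInteractionGrabbable>();
    }
}
```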
NearInteractionTouchable
Add a NearInteractionTouchable to your scene and configure a touchable surface in order to get PointerDown and PointerUp events whenever a PokePointer touches this surface.
NearInteractionTouchableInspector
NearInteractionTouchableInspectorBase
NearInteractionTouchableSurface
A near interaction object which is a flat surface and can be pressed in one direction.
NearInteractionTouchableUnityUI
Use a Unity UI RectTransform as touchable surface.
NearInteractionTouchableVolume
Add a NearInteractionTouchableVolume to your scene and configure a touchable volume in order to get PointerDown and PointerUp events whenever a PokePointer collides with this volume.
NearInteractionTouchableVolumeInspector
ObjectCursor
The object cursor can switch between different game objects based on its state. It simply links the game object to set to active with its associated cursor state.
OculusRemoteControllerDefinition
OculusTouchControllerDefinition
PointerClickHandler
This component handles pointer clicks from all types of input sources.
e.g. a primary mouse button click, a motion controller selection press, or a hand tap.
PointerHandler
Script used to raise Unity Events in response to pointer events.
PointerUnityEvent
Unity event for a pointer event. Contains the pointer event data.
PointerUtils
PokePointer
A near interaction pointer that generates touch events based on touchables in close proximity.
RiggedHandVisualizer
Hand visualizer that controls a hierarchy of transforms to be used by a SkinnedMeshRenderer. The implementation is derived from the Leap Motion RiggedHand and RiggedFinger and has visual parity with them.
ScreenSpaceMousePointer
Uses the desktop mouse cursor instead of any mouse representation within the scene. Its movement is bound to screen space.
ShellHandRayPointer
Implementation for the default hand ray pointers shipped with MRTK. Primarily used with hands and motion controllers.
SimpleHandDefinition
SimulatedArticulatedHand
SimulatedArticulatedHandPoses
This stores the joint pose JSON data that defines various articulated hand gestures for input simulation. The JSON data that defines each joint position and orientation is stored as strings to avoid file loading/targeting at runtime.
SimulatedControllerDataProvider
Produces simulated data every frame that defines the position and rotation of the simulated controller.
SimulatedGestureHand
SimulatedHand
SimulatedHandData
Snapshot of simulated hand data.
SimulatedHandDataProvider
Produces simulated data every frame that defines joint positions.
SimulatedHandUtils
SimulatedMotionController
SimulatedMotionControllerData
Snapshot of simulated motion controller data.
SimulatedMotionControllerDataProvider
Produces simulated data every frame that defines the position and rotation of the simulated controller.
SourcePoseEventData<T>
Describes a source change event.
SourceStateEventData
Describes a source state event that has a source id.
SpeechEventData
Describes an input event that involves keyword recognition.
SpeechInputHandler
This component handles the speech input events raised from the IMixedRealityInputSystem.
SpherePointer
SpherePointerGrabPoint
SpherePointerInspector
SpherePointerVisual
SpriteCursor
Object that represents a cursor composed of sprites and colors for each state.
TouchHandler
TouchPointer
Touch Pointer Implementation.
TouchScreenDefinition
ViveKnucklesControllerDefinition
ViveWandControllerDefinition
WindowsMixedRealityControllerDefinition
Defines the interactions and data that a Windows Mixed Reality motion controller can provide.
WindowsMixedRealityControllerVisualizer
WindowsMixedRealityHandRecorder
Record joint positions of a hand and log them for use in simulated hands.
XboxControllerDefinition
Defines the base interactions and data that a controller can provide.
Structs
Headset
The headset definition defines the headset as defined by the SDK / Unity.
InputActionEventPair
Data class that maps MixedRealityInputActions to UnityEvents wired up in the inspector.
InputActionRuleDigital
Generic Input Action Rule for raising actions based on specific criteria.
InputActionRuleDualAxis
Generic Input Action Rule for raising actions based on specific criteria.
InputActionRulePoseAxis
Generic Input Action Rule for raising actions based on specific criteria.
InputActionRuleQuaternionAxis
Generic Input Action Rule for raising actions based on specific criteria.
InputActionRuleSingleAxis
Generic Input Action Rule for raising actions based on specific criteria.
InputActionRuleVectorAxis
Generic Input Action Rule for raising actions based on specific criteria.
KeyBinding
Identifier of a key combination or mouse button for generic input binding.
KeywordAndResponse
Keyword/UnityEvent pair that ties voice input to UnityEvents wired up in the inspector.
MeshCursor.MeshCursorDatum
MixedRealityControllerMapping
Used to define a controller or other input device's physical buttons, and other attributes.
MixedRealityControllerVisualizationSetting
Used to define a controller's visualization settings.
MixedRealityGestureMapping
Data structure for mapping gestures to MixedRealityInputActions that can be raised by the Input System.
MixedRealityInputAction
An Input Action for mapping an action to an Input Source's button, joystick, sensor, etc.
MixedRealityInputDataProviderConfiguration
MixedRealityInteractionMappingLegacyInput
Represents the subset of data held by a MixedRealityInteractionMapping that represents Unity's legacy input system.
MixedRealityRaycastHit
The resulting hit information from an IMixedRealityRaycastProvider.
ObjectCursor.ObjectCursorDatum
PointerOption
Defines a pointer option to assign to a controller.
SimulatedMotionControllerButtonState
Struct storing the states of buttons on the motion controller.
SpeechCommands
Data structure for mapping Voice and Keyboard input to MixedRealityInputActions that can be raised by the Input System.
SpriteCursor.SpriteCursorDatum
Interfaces
ICursorModifier
Interface for cursor modifiers that can modify a GameObject's properties.
IHandRay
Interface defining a hand ray, which is used by far pointers to direct interactions. Implementations of this interface are managed and updated by a BaseHand implementation.
IInputActionRule<T>
Interface for defining Input Action Rules.
IInputSimulationService
IMixedRealityBaseInputHandler
Base interface for all input handlers. This allows us to use ExecuteEvents.ExecuteHierarchy<IMixedRealityBaseInputHandler> to send an event to all input handling interfaces.
IMixedRealityController
Mixed Reality Toolkit controller definition, used to manage a specific controller type.
IMixedRealityControllerPoseSynchronizer
Basic interface for synchronizing to a controller pose.
IMixedRealityControllerVisualizer
IMixedRealityCursor
Cursor Interface for handling input events and setting visibility.
IMixedRealityDictationHandler
Interface to implement dictation events.
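A minimal sketch of a dictation event receiver, assuming the MRTK 2.x IMixedRealityDictationHandler methods and a dictation data provider (e.g. Windows Dictation Input Provider) configured in the Input System profile; the class name is hypothetical.

```csharp
using Microsoft.MixedReality.Toolkit;
using Microsoft.MixedReality.Toolkit.Input;
using UnityEngine;

// Hypothetical dictation listener, registered globally so it receives events without focus.
public class DictationLogger : MonoBehaviour, IMixedRealityDictationHandler
{
    private void OnEnable()
    {
        CoreServices.InputSystem?.RegisterHandler<IMixedRealityDictationHandler>(this);
    }

    private void OnDisable()
    {
        CoreServices.InputSystem?.UnregisterHandler<IMixedRealityDictationHandler>(this);
    }

    public void OnDictationHypothesis(DictationEventData eventData)
    {
        Debug.Log($"Hypothesis: {eventData.DictationResult}");
    }

    public void OnDictationResult(DictationEventData eventData)
    {
        Debug.Log($"Result: {eventData.DictationResult}");
    }

    public void OnDictationComplete(DictationEventData eventData)
    {
        Debug.Log($"Complete: {eventData.DictationResult}");
    }

    public void OnDictationError(DictationEventData eventData)
    {
        Debug.LogError($"Dictation error: {eventData.DictationResult}");
    }
}
```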
IMixedRealityDictationSystem
Interface for the Mixed Reality Toolkit's dictation system, used to manage dictation input.
IMixedRealityEyeGazeDataProvider
Provides eye tracking information.
IMixedRealityEyeGazeProvider
Implements the Gaze Provider for an Input Source.
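A rough sketch of reading the current eye gaze ray each frame. The property names (EyeGazeProvider, IsEyeTrackingEnabledAndValid, GazeOrigin, GazeDirection) are assumed from the MRTK 2.x eye gaze provider; verify against your MRTK version.

```csharp
using Microsoft.MixedReality.Toolkit;
using Microsoft.MixedReality.Toolkit.Input;
using UnityEngine;

// Hypothetical debug helper drawing the eye gaze ray in the scene view.
public class EyeGazeRayLogger : MonoBehaviour
{
    private void Update()
    {
        IMixedRealityEyeGazeProvider eyeGaze = CoreServices.InputSystem?.EyeGazeProvider;
        if (eyeGaze != null && eyeGaze.IsEyeTrackingEnabledAndValid)
        {
            Debug.DrawRay(eyeGaze.GazeOrigin, eyeGaze.GazeDirection, Color.cyan);
        }
    }
}
```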
IMixedRealityEyeSaccadeProvider
Provides eye tracking saccade events.
IMixedRealityFocusChangedHandler
Interface to implement to react to focus changed events.
IMixedRealityFocusHandler
Interface to implement to react to focus enter/exit.
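A minimal focus-handling sketch: attached to a GameObject with a collider so pointers can focus it, the component swaps the material color on focus enter/exit. The class and field names are hypothetical.

```csharp
using Microsoft.MixedReality.Toolkit.Input;
using UnityEngine;

// Hypothetical focus highlighter; requires a Renderer and a Collider on the same GameObject.
public class FocusHighlighter : MonoBehaviour, IMixedRealityFocusHandler
{
    [SerializeField] private Color focusedColor = Color.yellow;

    private Color originalColor;
    private Renderer targetRenderer;

    private void Awake()
    {
        targetRenderer = GetComponent<Renderer>();
        originalColor = targetRenderer.material.color;
    }

    public void OnFocusEnter(FocusEventData eventData)
    {
        targetRenderer.material.color = focusedColor;
    }

    public void OnFocusExit(FocusEventData eventData)
    {
        targetRenderer.material.color = originalColor;
    }
}
```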
IMixedRealityFocusProvider
Implements the Focus Provider for handling focus of pointers.
IMixedRealityGazeProvider
Implements the Gaze Provider for an Input Source.
IMixedRealityGazeProviderHeadOverride
Adds ability to override head gaze on a gaze provider.
IMixedRealityGestureHandler
Interface to implement for generic gesture input.
IMixedRealityGestureHandler<T>
Interface to implement for generic gesture input.
IMixedRealityHand
Hand definition, used to provide access to hand joints and other data.
IMixedRealityHandJointHandler
Interface to implement for hand joint information.
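A sketch of a joint consumer, assuming the MRTK 2.x joint event signature: the component registers globally and reads the index fingertip pose from the per-frame joint dictionary. The class name is hypothetical.

```csharp
using System.Collections.Generic;
using Microsoft.MixedReality.Toolkit;
using Microsoft.MixedReality.Toolkit.Input;
using Microsoft.MixedReality.Toolkit.Utilities;
using UnityEngine;

// Hypothetical tracker logging the index fingertip position whenever hand joints update.
public class IndexTipTracker : MonoBehaviour, IMixedRealityHandJointHandler
{
    private void OnEnable()
    {
        CoreServices.InputSystem?.RegisterHandler<IMixedRealityHandJointHandler>(this);
    }

    private void OnDisable()
    {
        CoreServices.InputSystem?.UnregisterHandler<IMixedRealityHandJointHandler>(this);
    }

    public void OnHandJointsUpdated(InputEventData<IDictionary<TrackedHandJoint, MixedRealityPose>> eventData)
    {
        if (eventData.InputData.TryGetValue(TrackedHandJoint.IndexTip, out MixedRealityPose pose))
        {
            Debug.Log($"{eventData.Handedness} index tip at {pose.Position}");
        }
    }
}
```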
IMixedRealityHandJointService
Service interface that provides access to tracked hand joint transforms.
IMixedRealityHandMeshHandler
Interface to implement for hand mesh information.
IMixedRealityHandVisualizer
Hand visualization definition, used to provide access to hand joint objects.
IMixedRealityInputActionHandler
Interface to receive input action events.
IMixedRealityInputDeviceManager
Mixed Reality Toolkit input device definition, used to instantiate and manage one or more input devices.
IMixedRealityInputHandler
Interface to implement for simple generic input.
IMixedRealityInputHandler<T>
Interface to implement for more complex generic input.
IMixedRealityInputPlaybackService
Plays back input animation via the input simulation system.
IMixedRealityInputRecordingService
Provides input recording into an internal buffer and exporting to files.
IMixedRealityInputSource
Interface for an input source. An input source is the origin of user input and generally comes from a physical controller, sensor, or other hardware device.
IMixedRealityInputSourceDefinition
IMixedRealityInputSystem
Manager interface for an Input system in the Mixed Reality Toolkit. All replacement systems providing Input System functionality should derive from this interface.
IMixedRealityMouseDeviceManager
Interface defining a mouse input device manager.
IMixedRealityMousePointer
Interface for handling mouse pointers.
IMixedRealityNearPointer
IMixedRealityPointer
Interface for handling pointers.
IMixedRealityPointerHandler
Interface to implement to react to simple pointer input.
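A minimal sketch of a focus-based pointer handler: attach to a GameObject with a collider and implement the four pointer callbacks. The class name is hypothetical.

```csharp
using Microsoft.MixedReality.Toolkit.Input;
using UnityEngine;

// Hypothetical click responder; receives pointer events while this object is focused.
public class ClickResponder : MonoBehaviour, IMixedRealityPointerHandler
{
    public void OnPointerDown(MixedRealityPointerEventData eventData) { }

    public void OnPointerDragged(MixedRealityPointerEventData eventData) { }

    public void OnPointerUp(MixedRealityPointerEventData eventData) { }

    public void OnPointerClicked(MixedRealityPointerEventData eventData)
    {
        Debug.Log($"Clicked by {eventData.Pointer.PointerName}");
    }
}
```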
IMixedRealityPointerMediator
Interface for handling groups of pointers resolving conflicts between them. E.g., ensuring that far pointers are disabled when a near pointer is active.
IMixedRealityPrimaryPointerSelector
Interface used by the focus provider to select the pointer that will be considered as primary. The current primary pointer can be obtained via PrimaryPointer or by subscribing to the primary pointer changed event via SubscribeToPrimaryPointerChanged(PrimaryPointerChangedHandler, Boolean).
IMixedRealityRaycastProvider
Interface to handle raycasts into the scene. Used by FocusProvider to perform ray and sphere cast queries for pointers.
IMixedRealitySourcePoseHandler
Interface to implement to react to source pose changes.
IMixedRealitySourceStateHandler
Interface to implement to react to source state changes, such as when an input source is detected or lost.
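A sketch of a detection/loss listener, registered globally so it fires regardless of focus; the class name is hypothetical.

```csharp
using Microsoft.MixedReality.Toolkit;
using Microsoft.MixedReality.Toolkit.Input;
using UnityEngine;

// Hypothetical logger for input sources appearing and disappearing.
public class SourceStateLogger : MonoBehaviour, IMixedRealitySourceStateHandler
{
    private void OnEnable()
    {
        CoreServices.InputSystem?.RegisterHandler<IMixedRealitySourceStateHandler>(this);
    }

    private void OnDisable()
    {
        CoreServices.InputSystem?.UnregisterHandler<IMixedRealitySourceStateHandler>(this);
    }

    public void OnSourceDetected(SourceStateEventData eventData)
    {
        Debug.Log($"Source detected: {eventData.InputSource.SourceName}");
    }

    public void OnSourceLost(SourceStateEventData eventData)
    {
        Debug.Log($"Source lost: {eventData.InputSource.SourceName}");
    }
}
```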
IMixedRealitySpeechHandler
Interface to implement to react to speech recognition.
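A sketch of a speech keyword receiver. The keyword ("toggle" here) is a hypothetical entry in the Speech Commands profile; attach the component to a focused object, or register it globally as sketched under InputSystemGlobalListener above.

```csharp
using Microsoft.MixedReality.Toolkit.Input;
using UnityEngine;

// Hypothetical handler toggling this object's renderer when the "toggle" keyword is heard.
public class SpeechToggle : MonoBehaviour, IMixedRealitySpeechHandler
{
    public void OnSpeechKeywordRecognized(SpeechEventData eventData)
    {
        if (eventData.Command.Keyword.Equals("toggle", System.StringComparison.OrdinalIgnoreCase))
        {
            Renderer targetRenderer = GetComponent<Renderer>();
            targetRenderer.enabled = !targetRenderer.enabled;
        }
    }
}
```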
IMixedRealitySpeechSystem
Interface for the Mixed Reality Toolkit's speech recognition system, used to manage keyword recognition.
IMixedRealityTeleportPointer
IMixedRealityTouchHandler
Implementing this interface causes a script to receive notifications of touch events from HandTrackingInputSources.
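A sketch of a poke-touch receiver; the same GameObject also needs a collider and a touchable component (e.g. NearInteractionTouchable or NearInteractionTouchableVolume) so the PokePointer can raise events on it. The class name is hypothetical.

```csharp
using Microsoft.MixedReality.Toolkit.Input;
using UnityEngine;

// Hypothetical touch logger for objects decorated with a NearInteractionTouchable.
public class TouchLogger : MonoBehaviour, IMixedRealityTouchHandler
{
    public void OnTouchStarted(HandTrackingInputEventData eventData)
    {
        Debug.Log($"Touch started at {eventData.InputData}");
    }

    public void OnTouchUpdated(HandTrackingInputEventData eventData) { }

    public void OnTouchCompleted(HandTrackingInputEventData eventData)
    {
        Debug.Log("Touch completed");
    }
}
```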
IMixedRealityTouchPointer
Interface for handling touch pointers.
IPointerPreferences
Interface for getting and setting behaviors and other settings for pointers in the input system. Behaviors are described per pointer type and input type, not per pointer instance, so that newly created pointers maintain consistent behavior.
IPointerResult
Interface defining a pointer result.
Enums
ControllerSimulationMode
Defines how input simulation handles controllers.
CursorContextEnum
Enum for the current cursor context.
CursorContextInfo.CursorAction
CursorStateEnum
Enum for the current cursor state.
DeviceInputType
Defines the types of input exposed by a controller, denoting the available buttons / interactions that the controller supports.
EyeGazeSimulationMode
Defines how input simulation handles eye gaze.
GestureInputType
The GestureInputType defines the types of gestures exposed by a controller.
HandSimulationMode
Defines how input simulation handles controllers.
InputSimulationControlMode
Defines how input simulation handles movement.
InputSimulationWindow.ToolMode
InputSourceType
The InputSourceType defines the types of input sources.
KeyBinding.KeyType
The type of value encoded in the Microsoft.MixedReality.Toolkit.Input.KeyBinding.code property.
KeyBinding.MouseButton
Enum for interpreting the mouse button integer index.
MixedRealityControllerConfigurationFlags
Flags used by MixedRealityControllerAttribute.
MixedRealityControllerInfo.ControllerElementEnum
PointerBehavior
Specifies how a pointer in MRTK's default input system behaves.
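A sketch of changing pointer behavior at runtime via PointerUtils. The method and enum names (SetHandRayPointerBehavior, PointerBehavior.AlwaysOff/Default, Handedness.Any) are assumed from the MRTK 2.x API; verify against your version.

```csharp
using Microsoft.MixedReality.Toolkit.Input;
using Microsoft.MixedReality.Toolkit.Utilities;

// Hypothetical helper toggling far hand rays on and off for both hands.
public static class PointerConfigExample
{
    public static void DisableHandRays()
    {
        // Turn off far hand rays while leaving near pointers untouched.
        PointerUtils.SetHandRayPointerBehavior(PointerBehavior.AlwaysOff, Handedness.Any);
    }

    public static void RestoreDefaults()
    {
        PointerUtils.SetHandRayPointerBehavior(PointerBehavior.Default, Handedness.Any);
    }
}
```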
SupportedControllerType
The SupportedControllerType lists the XR SDKs supported by the Mixed Reality Toolkit. Initially this lists proposed SDKs; not all may be implemented at this time (see the release notes for more details).
TouchableEventType
Type of Events to receive from a PokePointer.
Delegates
PrimaryPointerChangedHandler
Delegate type used to handle primary pointer changes. Old and new pointer values can be null to indicate transition from or to no primary pointer, but they won't both be null simultaneously.
SimulatedHandData.HandJointDataGenerator
Generator function producing joint positions and rotations.
SimulatedMotionControllerData.MotionControllerPoseUpdater
Delegate for a function that updates the position and rotation of the motion controller.