    Feature Contribution Process

    Adding features to the Mixed Reality Toolkit (MRTK) is split into a few iterative steps so that maintainers have time to review and ensure the process goes smoothly. Please be sure to review the list of feature requirements before you get started.

    Process

    The following process has been drafted to ensure that all new work complies with the standards and architecture defined for the MRTK:

    1. Open a new Proposal and related Tasks
    2. Submit an Architecture Draft or outline
    3. Review and finalize the Architecture Documentation
    4. Submit a PR implementing the core feature interfaces and event data (if applicable)
    5. Submit a PR implementing any required SDK components
    6. Submit a PR implementing feature demos or full-scale examples

    New Proposal

    Start by opening a new Proposal or Task describing the feature or the problem you want to solve. Describe the approach and how it fits into the version of the Mixed Reality Toolkit you're targeting. This will enable everyone to have a discussion about the proposal and, hopefully, identify some potential pitfalls before any work is started.

    New Proposals are reviewed and discussed during our weekly ship room meetings; if a proposal is accepted, supplemental tasks will be created and assigned.

    Architecture Draft

    Once the initial proposal has been accepted, the first task is to draft the architecture document for the feature or work to be done. This document should typically be one to two pages long and include a high-level overview of the feature and how it relates to other parts of the Mixed Reality Toolkit.

    • The draft must be easy to consume with key areas highlighted.
    • The draft must include a list of the proposed core interfaces, configuration profiles, and event data.
    • The draft must include a simple graphic of the proposed architecture.

    Ensure that the architecture of the feature complies with the New Feature Requirements set out by the Core MRTK architecture.

    TODO: Add link to architecture draft template

    Once the draft is complete, it can be appended to the Proposal / Task issue on GitHub for final public review.

    Architecture Documentation

    Once the draft architecture is accepted, additional pull requests can be made to submit the final full architecture documents to the repository.

    TODO: Add link to the full architecture template

    Only once the architecture document is approved can the first code submissions be made.

    Development can begin in your own private branch and proceed as normal. However, the PRs submitted back to the core MRTK project should be staged to keep review and approval as smooth as possible (and to ensure core changes do not impact other features).

    Core Implementation

    The initial work to submit is to implement the following (a minimal sketch appears at the end of this section):

    • Definitions
    • Interfaces
    • Configuration profiles
    • Event data

    If needed, the architectural document can be updated to align with any changes to the implementation.

    Please ensure that all existing unit tests and any new tests are passing prior to submission.
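    As a rough, non-authoritative sketch of what this first PR might contain, consider a hypothetical "Widget" feature. All Widget names below are invented for illustration; the folder conventions follow the requirements later in this document, and exact namespaces may differ in your MRTK version.

        // Sketch only -- "Widget" is a placeholder feature name.
        // MixedRealityToolkit/_Core/definitions/WidgetSystem/MixedRealityWidgetProfile.cs
        using UnityEngine;

        // Configuration profile: a ScriptableObject holding the feature's settings.
        [CreateAssetMenu(menuName = "Mixed Reality Toolkit/Widget Profile", fileName = "MixedRealityWidgetProfile")]
        public class MixedRealityWidgetProfile : ScriptableObject
        {
            [SerializeField]
            [Tooltip("How often, in seconds, the widget system refreshes its state.")]
            private float updateInterval = 1f;

            public float UpdateInterval => updateInterval;
        }

        // MixedRealityToolkit/_Core/definitions/WidgetSystem/IMixedRealityWidgetSystem.cs
        // Core interface: declares the events the system can raise, along with the
        // parameters required to initialize the event data.
        public interface IMixedRealityWidgetSystem
        {
            void RaiseWidgetChanged(string widgetId);
        }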

    SDK Implementation

    Once the core interfaces and events are merged into development, work can be submitted for the SDK components, adding the concrete implementation of the feature and testing it against the supported platforms and unit tests.

    Example Implementation

    Once the SDK components are merged, then any demo scenes or updates to the example scenes can be submitted.

    • Demos highlight and demonstrate a specific feature.
    • Examples are full, working scenes intended for learning.

    New Feature Requirements

    Most feature implementations can be broken down into 3 main parts:

    1. The Feature Manager
    2. The Event Data (Optional)
    3. The Feature Handler (Optional)

    Manager Implementation Requirements

    • Assembly Definitions for code outside of the MixedRealityToolkit/_Core folder.
      • This ensures features are self-contained and have no dependencies on other features.
      • This only applies to the MixedRealityToolkit folder.
    • Be defined using an interface found in MixedRealityToolkit/_Core/definitions/<FeatureName>System.
    • A feature's concrete manager implementation should inherit directly from BaseManager, or from MixedRealityEventManager if it will raise events.
    • A feature's concrete manager implementation should set up and verify that the scene is ready for that system to use in Initialize.
    • A feature's concrete manager should also clean up after itself in Destroy, removing anything it created in the scene (see the sketch after this list).
    • Be registered with the Mixed Reality Manager.
      • If the feature is a core feature, this should be hard coded into the MixedRealityManager and added to the MixedRealityConfigurationProfile.
        • This includes being able to specify a concrete implementation via dropdown using SystemType.
        • Features should have a configuration profile that derives from a scriptable object.
        • A default configuration profile should be located in MixedRealityToolkit-SDK/Profiles and assigned in the default configuration profile for the Mixed Reality Manager.
      • If this feature is not a core feature, then it must be registered using the component configuration profile and implement IMixedRealityComponent.
    • Have a default implementation located in MixedRealityToolkit-SDK/Features/<FeatureName>.
    • Events that can be raised with the system should be defined in the interface, with all the required parameters for initializing the event data.
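    As a minimal sketch of these manager requirements, again using the hypothetical Widget feature from above: the class inherits BaseManager (it would inherit MixedRealityEventManager instead if it raised events), sets the scene up in Initialize, and cleans up in Destroy. Namespaces are omitted and the exact base-class members may differ in your MRTK version.

        // Sketch only: MixedRealityToolkit-SDK/Features/Widget/MixedRealityWidgetManager.cs
        using UnityEngine;

        public class MixedRealityWidgetManager : BaseManager, IMixedRealityWidgetSystem
        {
            private GameObject widgetRoot;

            // Set up and verify that the scene is ready for this system to use.
            public override void Initialize()
            {
                widgetRoot = new GameObject("WidgetRoot");
            }

            // Clean up after ourselves, removing anything created in the scene.
            public override void Destroy()
            {
                if (widgetRoot != null)
                {
                    Object.Destroy(widgetRoot);
                }
            }

            public void RaiseWidgetChanged(string widgetId)
            {
                // A manager that raises events would inherit MixedRealityEventManager
                // and forward a populated WidgetEventData to registered handlers here.
            }
        }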

    Event Data Implementation Requirements

    The Event Data defines exactly what data the handler is expected to receive from the event.

    • All event data for the feature should be defined in MixedRealityToolkit/_Core/EventDatum/<FeatureName>.
    • All new event data classes should inherit from GenericBaseEventData.
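    A minimal sketch of an event data class for the hypothetical Widget feature, assuming GenericBaseEventData follows the usual Unity event-data pattern of a constructor taking the owning EventSystem plus an Initialize method (the WidgetEventData name and its field are invented):

        // Sketch only: MixedRealityToolkit/_Core/EventDatum/Widget/WidgetEventData.cs
        using UnityEngine.EventSystems;

        public class WidgetEventData : GenericBaseEventData
        {
            // The data handlers receive when the event is raised.
            public string WidgetId { get; private set; }

            public WidgetEventData(EventSystem eventSystem) : base(eventSystem) { }

            // Populate the event data just before the system raises the event.
            public void Initialize(string widgetId)
            {
                WidgetId = widgetId;
            }
        }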

    Handler Implementation Requirements

    The Handler Interface defines each event a component should be listening for and the types of data passed. End users will implement the interface to execute logic based on the event data received.

    • Handler interfaces should be defined in MixedRealityToolkit/_Core/Interfaces/<FeatureName>System/Handlers.
    • Handler interfaces should inherit from UnityEngine.EventSystems.IEventSystemHandler.
    • Handlers are opt-in by default: to receive events from the system, a handler must register itself with that system (see the sketch below).
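    To illustrate, here is a sketch of a handler interface and an end-user component for the hypothetical Widget feature. All Widget names are placeholders, and the commented-out Register/Unregister calls stand in for whatever registration mechanism the concrete system exposes.

        // Sketch only.
        // MixedRealityToolkit/_Core/Interfaces/WidgetSystem/Handlers/IMixedRealityWidgetHandler.cs
        using UnityEngine;
        using UnityEngine.EventSystems;

        public interface IMixedRealityWidgetHandler : IEventSystemHandler
        {
            // Raised by the widget system whenever a widget changes.
            void OnWidgetChanged(WidgetEventData eventData);
        }

        // End-user component: opts in by registering itself with the system.
        public class WidgetChangeLogger : MonoBehaviour, IMixedRealityWidgetHandler
        {
            private void OnEnable()
            {
                // Hypothetical registration call; handlers receive no events
                // until they register with the feature's system.
                // widgetSystem.Register(gameObject);
            }

            private void OnDisable()
            {
                // widgetSystem.Unregister(gameObject);
            }

            public void OnWidgetChanged(WidgetEventData eventData)
            {
                Debug.Log($"Widget changed: {eventData.WidgetId}");
            }
        }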