Unity library overview
The Unity library offers a simple way to add real-time communication to an existing Unity application. MixedReality-WebRTC provides a collection of Unity components (MonoBehaviour-derived classes) which encapsulate objects from the underlying C# library, and allow in-editor configuration as well as establishing a connection to a remote peer, both in standalone builds and in Play mode.
- The PeerConnection component is the entry point for configuring and establishing a peer-to-peer connection.
- The peer connection component uses a signaler (generally derived from the Signaler utility base class) to dispatch SDP messages. This process continues after the connection has started, as the signaler handles all track and transceiver (re-)negotiations, even once a direct peer-to-peer connection for media transport is established.
- Audio and video capture from local devices (a microphone and a webcam, respectively) is handled by the MicrophoneSource and WebcamSource components. These sources can be shared by multiple peer connections.
- For remote tracks, the AudioReceiver and VideoReceiver components respectively configure a remote audio and video track streamed from the remote peer. Unlike the track sources above, these components encapsulate tracks and are tied to a specific peer connection.
- Rendering of both local and remote video sources can be handled by the VideoRenderer utility component, which connects to a video source and renders it through a custom shader into a Unity Texture2D object, which can then be applied to a mesh rendered in the scene.
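As an illustration of how those components fit together, the following sketch wires a local webcam and microphone plus a remote video receiver onto a peer connection from a script. This is a non-authoritative sketch assuming the MixedReality-WebRTC 2.x Unity API (PeerConnection.AddMediaLine, MediaLine.Source/Receiver); in practice these media lines are typically configured in the Inspector rather than in code, and exact names may differ between library versions.

```csharp
// Sketch only: assumes the MixedReality-WebRTC 2.x Unity API.
using UnityEngine;
using Microsoft.MixedReality.WebRTC;
using Microsoft.MixedReality.WebRTC.Unity;

public class CallSetup : MonoBehaviour
{
    // Components assigned in the Inspector.
    public PeerConnection peerConnection;
    public MicrophoneSource microphoneSource;
    public WebcamSource webcamSource;
    public VideoReceiver videoReceiver;

    private void Start()
    {
        // Each media line pairs a local source (sent to the remote peer)
        // with a receiver (for the track streamed back by that peer).
        MediaLine audioLine = peerConnection.AddMediaLine(MediaKind.Audio);
        audioLine.Source = microphoneSource;

        MediaLine videoLine = peerConnection.AddMediaLine(MediaKind.Video);
        videoLine.Source = webcamSource;
        videoLine.Receiver = videoReceiver;
    }
}
```

A VideoRenderer component in the scene can then be pointed at either the WebcamSource (local preview) or the VideoReceiver (remote feed) to display the frames on a Texture2D.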
Note
Local audio is never played out locally (there is no local frame callback); it is only streamed to the remote peer. Local video, however, can be played back locally by registering a frame callback with the video track source.
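The local-video frame callback mentioned above lives at the level of the underlying C# library rather than the Unity components. A minimal sketch, assuming the MixedReality-WebRTC 2.x C# API (DeviceVideoTrackSource.CreateAsync and the I420AVideoFrameReady event; names may vary by version):

```csharp
// Sketch only: C# library level, assuming the 2.x API.
using System.Threading.Tasks;
using Microsoft.MixedReality.WebRTC;

public static class LocalVideoPreview
{
    public static async Task RunAsync()
    {
        // Open the default webcam as a video track source.
        VideoTrackSource source = await DeviceVideoTrackSource.CreateAsync();

        // Local frames arrive in I420A format; a renderer can copy the
        // Y/U/V planes into a texture for local playback.
        source.I420AVideoFrameReady += (I420AVideoFrame frame) =>
        {
            // Inspect frame.width, frame.height, plane data, etc.
        };
    }
}
```

In the Unity layer this is what the VideoRenderer component does internally when rendering a local source; no equivalent exists for audio, which is why local audio cannot be monitored.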