MixedReality-WebRTC

    Namespace Microsoft.MixedReality.WebRTC

    Classes

    Argb32VideoFrameStorage

    Storage for a video frame encoded in ARGB format.

    AudioTrackReadBuffer

    High-level interface for consuming WebRTC audio tracks. Enqueues audio frames for a RemoteAudioTrack in an internal buffer as they arrive. Users should call Read(Int32, Int32, Single[], out Int32, out Boolean, AudioTrackReadBuffer.PadBehavior) to read samples from the buffer when needed.
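
    A minimal sketch of that pull pattern, assuming the read buffer is obtained from a RemoteAudioTrack through a CreateReadBuffer() factory method (that method name and the meaning of the out parameters are assumptions; the Read() signature follows the listing above).

```csharp
using Microsoft.MixedReality.WebRTC;

// Sketch: pull decoded audio samples from a remote track on demand.
static void PullAudio(RemoteAudioTrack remoteAudioTrack)
{
    // Assumption: the read buffer is obtained from the remote track like this.
    AudioTrackReadBuffer readBuffer = remoteAudioTrack.CreateReadBuffer();

    var samples = new float[480 * 2]; // 10 ms of 48 kHz stereo audio

    // Call from the audio rendering thread whenever more samples are needed.
    readBuffer.Read(48000, 2, samples,
        out int numSamplesRead,
        out bool hasOverrun,
        default /* AudioTrackReadBuffer.PadBehavior; concrete value left out (assumption) */);

    // 'numSamplesRead' samples are now in 'samples'; 'hasOverrun' (assumed meaning)
    // signals that the internal buffer overflowed and audio was dropped.
}
```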

    AudioTrackSource

    Audio source for WebRTC audio tracks.

    The audio source is not bound to any peer connection, and can therefore be shared by multiple audio tracks from different peer connections. This is especially useful to share local audio capture devices (microphones) amongst multiple peer connections when building a multi-peer experience with a mesh topology (one connection per pair of peers).

    The user owns the audio track source and is responsible for keeping it alive until all tracks using it have been destroyed, and for disposing of it afterwards. Disposing of the track source while a track is still using it results in undefined behavior. The Tracks property contains the list of tracks currently using the source.
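
    The sketch below illustrates the sharing and ownership rules above with one microphone source feeding two local audio tracks (for example, one per peer connection in a mesh). The factory methods and the trackName field used here are assumptions based on the class names listed on this page.

```csharp
using System.Threading.Tasks;
using Microsoft.MixedReality.WebRTC;

static async Task ShareMicrophoneAsync()
{
    // One capture device source, shared by two tracks (e.g. one track per peer connection
    // in a mesh topology). An optional LocalAudioDeviceInitConfig can select a device.
    AudioTrackSource micSource = await DeviceAudioTrackSource.CreateAsync();

    var trackA = LocalAudioTrack.CreateFromSource(micSource,
        new LocalAudioTrackInitConfig { trackName = "mic_to_peer_A" });
    var trackB = LocalAudioTrack.CreateFromSource(micSource,
        new LocalAudioTrackInitConfig { trackName = "mic_to_peer_B" });

    // ... attach trackA/trackB to audio transceivers of two different peer connections ...

    // Tear-down order matters: destroy the tracks first, then dispose of the shared source.
    trackA.Dispose();
    trackB.Dispose();
    micSource.Dispose();
}
```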

    BufferTooSmallException

    Exception raised when a buffer is too small to perform the current operation.

    Generally the buffer was provided by the caller, and this indicates that the caller must provide a larger buffer.

    DataChannel

    Encapsulates a data channel of a peer connection.

    A data channel is a "pipe" allowing arbitrary data to be sent to and received from the remote peer. Data channels are transported over SCTP encapsulated in DTLS, and are therefore secure (encrypted). The exact security guarantees are provided by the underlying WebRTC core implementation and the WebRTC standard itself.

    See https://tools.ietf.org/wg/rtcweb/ and https://www.w3.org/TR/webrtc/ for details.

    An instance of DataChannel is created either manually by calling AddDataChannelAsync() or one of its variants, or automatically by the implementation when a new data channel is created in-band by the remote peer (see the DataChannelAdded event). DataChannel cannot be instantiated directly.
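
    A hedged sketch of manual channel creation, assuming an AddDataChannelAsync(label, ordered, reliable) overload and a DataChannelAdded delegate taking the new channel as its only argument.

```csharp
using System.Threading.Tasks;
using Microsoft.MixedReality.WebRTC;

// Sketch: create a data channel manually. Add it *before* CreateOffer() so the SCTP
// handshake is negotiated (see SctpNotNegotiatedException further down this page).
static async Task<DataChannel> AddChatChannelAsync(PeerConnection pc)
{
    // Assumed parameter order: label, ordered, reliable.
    DataChannel chat = await pc.AddDataChannelAsync("chat", true, true);

    // Channels opened in-band by the remote peer are surfaced through DataChannelAdded.
    pc.DataChannelAdded += (DataChannel remoteOpened) =>
    {
        // ... hook up message handling for the remotely-opened channel ...
    };
    return chat;
}
```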

    DataChannelNotOpenException

    Exception thrown when trying to use a data channel that is not open.

    The user should listen to the StateChanged event until the State property is Open before trying to send a message with SendMessage(Byte[]).
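
    A small sketch of that pattern, assuming StateChanged is a parameterless notification; only the State, DataChannel.ChannelState and SendMessage(Byte[]) members listed on this page are otherwise relied upon.

```csharp
using System.Text;
using Microsoft.MixedReality.WebRTC;

// Sketch: defer sending until the channel reports the Open state, to avoid
// DataChannelNotOpenException.
static void SendWhenOpen(DataChannel channel, string text)
{
    void TrySend()
    {
        if (channel.State == DataChannel.ChannelState.Open)
        {
            channel.SendMessage(Encoding.UTF8.GetBytes(text));
        }
    }

    channel.StateChanged += TrySend; // assumption: StateChanged carries no arguments
    TrySend();                       // in case the channel is already open
}
```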

    DeviceAudioTrackSource

    Implementation of an audio track source producing frames captured from an audio capture device (microphone).

    DeviceVideoTrackSource

    Implementation of a video track source producing frames captured from a video capture device (webcam).

    ExternalVideoTrackSource

    Video source for WebRTC video tracks based on a custom source of video frames managed by the user and external to the WebRTC implementation.

    This class is used to inject into the WebRTC engine a video track whose frames are produced by a user-managed source that the WebRTC engine knows nothing about, such as programmatically generated frames (including frames not strictly of video origin, like a 3D rendered scene) or frames coming from a capture device not natively supported by WebRTC. This class serves as an adapter for such video frame sources.
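
    A hedged sketch of such an adapter, assuming an ExternalVideoTrackSource.CreateFromI420ACallback() factory, a FrameRequest.CompleteRequest() completion method, and the I420AVideoFrame field names and types used below; it serves a flat gray image for every frame request.

```csharp
using System;
using System.Runtime.InteropServices;
using Microsoft.MixedReality.WebRTC;

// Sketch: feed programmatically generated I420A frames into WebRTC.
static ExternalVideoTrackSource CreateSyntheticSource(int width, int height)
{
    // Mid-gray luma plane and neutral chroma plane, pinned so native code can read them.
    // (In real code, free the GCHandles when the source is disposed.)
    byte[] yPlane = new byte[width * height];
    byte[] uvPlane = new byte[(width / 2) * (height / 2)];
    for (int i = 0; i < yPlane.Length; ++i) yPlane[i] = 128;
    for (int i = 0; i < uvPlane.Length; ++i) uvPlane[i] = 128;
    GCHandle yHandle = GCHandle.Alloc(yPlane, GCHandleType.Pinned);
    GCHandle uvHandle = GCHandle.Alloc(uvPlane, GCHandleType.Pinned);

    return ExternalVideoTrackSource.CreateFromI420ACallback((in FrameRequest request) =>
    {
        // Field names/types below are assumptions based on the I420AVideoFrame struct.
        var frame = new I420AVideoFrame
        {
            width = (uint)width,
            height = (uint)height,
            dataY = yHandle.AddrOfPinnedObject(),
            dataU = uvHandle.AddrOfPinnedObject(),
            dataV = uvHandle.AddrOfPinnedObject(),
            dataA = IntPtr.Zero, // no alpha plane
            strideY = width,
            strideU = width / 2,
            strideV = width / 2,
            strideA = 0
        };
        request.CompleteRequest(in frame);
    });
}
```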

    I420AVideoFrameStorage

    Storage for a video frame encoded in I420+Alpha format.

    IceCandidate

    ICE candidate to send to a remote peer or received from it.

    IceServer

    ICE server configuration (STUN and/or TURN).

    InvalidInteropNativeHandleException

    Exception thrown when an API function expects an interop handle to a valid native object, but receives an invalid handle instead.

    Library

    Container for library-wise global settings of MixedReality-WebRTC.

    LocalAudioDeviceInitConfig

    Configuration to initialize capture on a local audio device (microphone).

    LocalAudioTrack

    Audio track sending to the remote peer audio frames originating from a local track source (local microphone or other audio recording device).

    LocalAudioTrackInitConfig

    Settings for adding a local audio track backed by a local audio capture device (e.g. microphone).

    LocalMediaTrack

    Base class for media tracks sending to the remote peer.

    LocalVideoDeviceInitConfig

    Configuration to initialize capture on a local video device (webcam).

    LocalVideoTrack

    Video track sending to the remote peer video frames originating from a local track source.

    LocalVideoTrackInitConfig

    Settings for creating a new local video track.

    Logging

    Logging utilities.

    MediaTrack

    Base class for media tracks sending to or receiving from the remote peer.

    MovingAverage

    Utility to manage a moving average of a time series.

    PeerConnection

    The WebRTC peer connection object is the entry point to using WebRTC.
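
    A minimal bring-up sketch for the offering peer. The types and events used (PeerConnectionConfiguration, IceServer, LocalSdpReadytoSend, IceCandidateReadytoSend, CreateOffer) appear elsewhere in this namespace; the exact signatures and the STUN server URL are assumptions.

```csharp
using System.Threading.Tasks;
using Microsoft.MixedReality.WebRTC;

// Sketch: initialize a peer connection and wire it to a user-provided signaling solution.
static async Task<PeerConnection> ConnectAsync()
{
    var pc = new PeerConnection();

    var config = new PeerConnectionConfiguration
    {
        IceServers = { new IceServer { Urls = { "stun:stun.l.google.com:19302" } } }
    };
    await pc.InitializeAsync(config);

    // Hand generated SDP messages and ICE candidates to your own signaling solution.
    pc.LocalSdpReadytoSend += (SdpMessage message) => { /* send via signaling */ };
    pc.IceCandidateReadytoSend += (IceCandidate candidate) => { /* send via signaling */ };

    // Add transceivers and/or data channels here, then start the offer.
    pc.CreateOffer();
    return pc;
}
```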

    PeerConnection.StatsReport

    Snapshot of the statistics related to a peer connection and its tracks. The various stats objects can be read through GetStats<T>().
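
    A sketch of reading a report, assuming a GetSimpleStatsAsync() method on PeerConnection produces the snapshot; GetStats<T>() then filters by stats object type as described above.

```csharp
using System.Threading.Tasks;
using Microsoft.MixedReality.WebRTC;

// Sketch: fetch a stats snapshot and enumerate one category of stats objects.
static async Task ReadVideoSendStatsAsync(PeerConnection pc)
{
    // Assumption: GetSimpleStatsAsync() is the call producing the StatsReport snapshot.
    PeerConnection.StatsReport report = await pc.GetSimpleStatsAsync();

    foreach (PeerConnection.VideoSenderStats stats in report.GetStats<PeerConnection.VideoSenderStats>())
    {
        // Inspect the fields of interest here (field names are not listed on this page).
    }
}
```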

    PeerConnectionConfiguration

    Configuration to initialize a PeerConnection.

    RemoteAudioTrack

    Audio track receiving audio frames from the remote peer.

    RemoteVideoTrack

    Video track receiving video frames from the remote peer.

    SctpNotNegotiatedException

    Exception thrown when trying to add a data channel to a peer connection after a connection to a remote peer was established without an SCTP handshake. When using data channels, at least one data channel must be added to the peer connection before calling CreateOffer(), to signal to the implementation the intent to use data channels and the need to perform an SCTP handshake during the connection.

    SdpMessage

    SDP message passed between the local and remote peers via the user's signaling solution.

    TaskExtensions

    Collection of extension methods for Task.

    Transceiver

    Transceiver of a peer connection.

    A transceiver is a media "pipe" connecting the local and remote peers, used to transmit media data (audio or video) between them. The transceiver has a media flow direction indicating whether it is sending and/or receiving any media, or is inactive. When sending media, the transceiver's local track is used as the source of that media. Conversely, when receiving media, that media is delivered to the remote media track of the transceiver. As a convenience, the local track can be null if the local peer does not have anything to send; in that case some empty media (black frames for video, silence for audio) is automatically sent instead at a very reduced rate. To completely stop sending, the media direction must be changed instead.

    Transceivers are owned by the peer connection which creates them, and cannot be destroyed nor removed from the peer connection. They become invalid when the peer connection is closed, and should not be used after that.
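
    A hedged sketch of typical transceiver usage, assuming an AddTransceiver(MediaKind, TransceiverInitSettings) method and the property names shown (Name, InitialDesiredDirection, LocalVideoTrack, DesiredDirection).

```csharp
using Microsoft.MixedReality.WebRTC;

// Sketch: create a video transceiver, attach a local track to send, and later stop
// sending by changing the direction rather than detaching the track.
static Transceiver AddVideoTransceiver(PeerConnection pc, LocalVideoTrack localTrack)
{
    var init = new TransceiverInitSettings
    {
        Name = "video_transceiver",
        InitialDesiredDirection = Transceiver.Direction.SendReceive
    };
    Transceiver transceiver = pc.AddTransceiver(MediaKind.Video, init);

    transceiver.LocalVideoTrack = localTrack; // media to send
    // transceiver.RemoteVideoTrack is populated by the implementation when receiving.

    // To completely stop sending (a null local track would still send empty media):
    // transceiver.DesiredDirection = Transceiver.Direction.ReceiveOnly;
    return transceiver;
}
```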

    TransceiverInitSettings

    Settings to create a new transceiver wrapper.

    VideoFrameQueue<T>

    Small queue of video frames received from a source and pending delivery to a sink. Used as temporary buffer between the WebRTC callback (push model) and the video player rendering (pull model). This also handles dropping frames when the source is faster than the sink, by limiting the maximum queue length.
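
    A sketch of that push/pull bridging; the Enqueue/TryDequeue/RecycleStorage method names and the I420AVideoFrameReady event are assumptions based on the types listed on this page.

```csharp
using Microsoft.MixedReality.WebRTC;

// Sketch: bridge the push-model frame callback and a pull-model renderer with a small queue.
class RemoteVideoBuffer
{
    // Keep at most 3 frames; older frames are dropped if the renderer falls behind.
    private readonly VideoFrameQueue<I420AVideoFrameStorage> _queue =
        new VideoFrameQueue<I420AVideoFrameStorage>(3);

    public void Attach(RemoteVideoTrack track)
    {
        // Push side: WebRTC delivers frames on its own thread.
        track.I420AVideoFrameReady += (I420AVideoFrame frame) => _queue.Enqueue(frame);
    }

    // Pull side: called from the rendering loop.
    public bool TryRender()
    {
        if (_queue.TryDequeue(out I420AVideoFrameStorage storage))
        {
            // ... upload the frame data to a texture ...
            _queue.RecycleStorage(storage);
            return true;
        }
        return false;
    }
}
```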

    VideoProfile

    Video profile.

    VideoTrackSource

    Video source for WebRTC video tracks.

    The video source is not bound to any peer connection, and can therefore be shared by multiple video tracks from different peer connections. This is especially useful to share local video capture devices (webcams) amongst multiple peer connections when building a multi-peer experience with a mesh topology (one connection per pair of peers).

    The user owns the video track source and is responsible for keeping it alive until all tracks using it have been destroyed, and for disposing of it afterwards. Disposing of the track source while a track is still using it results in undefined behavior. The Tracks property contains the list of tracks currently using the source.

    Structs

    Argb32VideoFrame

    Single video frame encoded in ARGB interleaved format (32 bits per pixel).

    The ARGB components are in the order of a little-endian 32-bit integer, so 0xAARRGGBB, or (B, G, R, A) as a sequence of bytes in memory with B first and A last.
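
    A small illustration of that byte order, independent of any particular API:

```csharp
using System;

// A pixel packed as a little-endian 32-bit 0xAARRGGBB value is laid out in memory as B, G, R, A.
byte a = 0xFF, r = 0x10, g = 0x20, b = 0x30;
uint pixel = (uint)((a << 24) | (r << 16) | (g << 8) | b); // 0xFF102030
byte[] bytes = BitConverter.GetBytes(pixel);               // on a little-endian host:
// bytes[0] == 0x30 (B), bytes[1] == 0x20 (G), bytes[2] == 0x10 (R), bytes[3] == 0xFF (A)
```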

    AudioFrame

    Single raw uncompressed audio frame.

    FrameRequest

    Request sent to an external video source via its registered callback to generate a new video frame for the track(s) connected to it.

    I420AVideoFrame

    Single video frame encoded in I420A format (triplanar YUV with optional alpha plane). See e.g. https://wiki.videolan.org/YUV/#I420 for details.

    The I420 format uses chroma downsampling in both directions, resulting in 12 bits per pixel. With the optional alpha plane, the size increases to 20 bits per pixel.

    PeerConnection.AudioReceiverStats

    Subset of RTCMediaStreamTrack (audio receiver) and RTCInboundRTPStreamStats. See https://www.w3.org/TR/webrtc-stats/#aststats-dict* and https://www.w3.org/TR/webrtc-stats/#inboundrtpstats-dict*.

    PeerConnection.AudioSenderStats

    Subset of RTCMediaStreamTrack (audio sender) and RTCOutboundRTPStreamStats. See https://www.w3.org/TR/webrtc-stats/#raststats-dict* and https://www.w3.org/TR/webrtc-stats/#sentrtpstats-dict*.

    PeerConnection.DataChannelStats

    Subset of RTCDataChannelStats. See https://www.w3.org/TR/webrtc-stats/#dcstats-dict*

    PeerConnection.H264Config

    Configuration for the Media Foundation H.264 encoder.

    PeerConnection.TransportStats

    Subset of RTCTransportStats. See https://www.w3.org/TR/webrtc-stats/#transportstats-dict*.

    PeerConnection.VideoReceiverStats

    Subset of RTCMediaStreamTrack (video receiver) and RTCInboundRTPStreamStats. See https://www.w3.org/TR/webrtc-stats/#rvststats-dict* and https://www.w3.org/TR/webrtc-stats/#inboundrtpstats-dict*.

    PeerConnection.VideoSenderStats

    Subset of RTCMediaStreamTrack (video sender) and RTCOutboundRTPStreamStats. See https://www.w3.org/TR/webrtc-stats/#vsstats-dict* and https://www.w3.org/TR/webrtc-stats/#sentrtpstats-dict*.

    VideoCaptureDevice

    Identifier for a video capture device.

    VideoCaptureFormat

    Capture format for a video track.

    Interfaces

    IAudioSource

    Interface for audio sources, whether local sources/tracks or remote tracks.

    ILogSink

    Interface for a sink receiving log messages. The sink can be registered with AddSink(ILogSink, LogSeverity) to receive logging messages.

    IVideoFrameQueue

    Interface for a queue of video frames.

    IVideoFrameStorage

    Interface for a storage of a single video frame.

    IVideoSource

    Interface for video sources, whether local or remote.

    Enums

    AudioDeviceModule

    Audio device module for Windows Desktop platform.

    AudioTrackReadBuffer.PadBehavior

    Controls the padding behavior of Read(Int32, Int32, Single[], out Int32, out Boolean, AudioTrackReadBuffer.PadBehavior) on underrun.

    BundlePolicy

    Bundle policy. See https://www.w3.org/TR/webrtc/#rtcbundlepolicy-enum.

    DataChannel.ChannelState

    Connection state of a data channel.

    DataChannel.MessageKind

    Type of message sent or received through the data channel.

    IceConnectionState

    State of an ICE connection.

    IceGatheringState

    State of an ICE gathering process.

    IceTransportType

    Type of ICE candidates offered to the remote peer.

    Library.ShutdownOptionsFlags

    Options for library shutdown.

    LogSeverity

    Log message severity.

    MediaKind

    Type of media track or media transceiver.

    PeerConnection.FrameHeightRoundMode

    Frame height round mode.

    PeerConnection.H264Profile

    H.264 Encoding profile.

    PeerConnection.H264RcMode

    Rate control mode for the Media Foundation H.264 encoder. See https://docs.microsoft.com/en-us/windows/win32/medfound/h-264-video-encoder for details.

    PeerConnection.TrackKind

    Kind of WebRTC track.

    SdpMessageType

    Type of SDP message.

    SdpSemantic

    SDP semantic used for (re)negotiating a peer connection.

    Transceiver.Direction

    Direction of the media flowing inside the transceiver.

    VideoEncoding

    Enumeration of video encodings.

    VideoProfileKind

    Kind of video profile. This corresponds to the KnownVideoProfile enum of the UWP Windows.Media.Capture API.

    Delegates

    Argb32VideoFrameDelegate

    Delegate used for events when an ARGB-encoded video frame has been produced and is ready for consumption.

    Argb32VideoFrameRequestDelegate

    Callback invoked when the WebRTC pipeline needs an external video source to generate a new video frame for the track(s) it is connected to.

    AudioFrameDelegate

    Delegate used for events when an audio frame has been produced and is ready for consumption.

    DataChannel.BufferingChangedDelegate

    Delegate for the BufferingChanged event.

    I420AVideoFrameDelegate

    Delegate used for events when an I420-encoded video frame has been produced and is ready for consumption.

    I420AVideoFrameRequestDelegate

    Callback invoked when the WebRTC pipeline needs an external video source to generate a new video frame for the track(s) it is connected to.

    PeerConnection.AudioTrackAddedDelegate

    Delegate for AudioTrackAdded event.

    PeerConnection.AudioTrackRemovedDelegate

    Delegate for AudioTrackRemoved event.

    PeerConnection.DataChannelAddedDelegate

    Delegate for DataChannelAdded event.

    PeerConnection.DataChannelRemovedDelegate

    Delegate for DataChannelRemoved event.

    PeerConnection.IceCandidateReadytoSendDelegate

    Delegate for the IceCandidateReadytoSend event.

    PeerConnection.IceGatheringStateChangedDelegate

    Delegate for the IceGatheringStateChanged event.

    PeerConnection.IceStateChangedDelegate

    Delegate for the IceStateChanged event.

    PeerConnection.LocalSdpReadyToSendDelegate

    Delegate for LocalSdpReadytoSend event.

    PeerConnection.TransceiverAddedDelegate

    Delegate for TransceiverAdded event.

    PeerConnection.VideoTrackAddedDelegate

    Delegate for VideoTrackAdded event.

    PeerConnection.VideoTrackRemovedDelegate

    Delegate for VideoTrackRemoved event.

    TransceiverAssociatedDelegate

    Delegate for the Associated event.

    TransceiverDirectionChangedDelegate

    Delegate for the DirectionChanged event.
