Class RemoteAudioTrack
Audio track receiving audio frames from the remote peer.
Implements
IAudioSource
Namespace: Microsoft.MixedReality.WebRTC
Assembly: Microsoft.MixedReality.WebRTC.dll
Syntax
public class RemoteAudioTrack : MediaTrack, IAudioSource
Remarks
Instances of this class are created by PeerConnection when a negotiation adds tracks sent by the remote peer.
New tracks are automatically played on the system audio device after AudioTrackAdded is fired on track creation. To prevent a track from being played, call OutputToDevice(Boolean) with false in an AudioTrackAdded handler (or later), as in the sketch below.
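As an illustration, the following is a minimal sketch of disabling automatic playback from an AudioTrackAdded handler. It assumes peerConnection is a connected PeerConnection instance and that AudioTrackAdded is the event referred to above; adjust names to your application.

```csharp
// Sketch: prevent newly added remote audio tracks from being played
// automatically. Assumes "peerConnection" is a connected PeerConnection
// and AudioTrackAdded is the event mentioned in the remarks above.
peerConnection.AudioTrackAdded += (RemoteAudioTrack track) =>
{
    // Frames are still received and passed to callbacks; they are just
    // not output to the system audio device. (Not supported on UWP.)
    track.OutputToDevice(false);
};
```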
Properties
Enabled
Enabled status of the track. If enabled, the track receives audio frames from the remote peer as expected. If disabled, the track does not receive anything (silence).
Declaration
public bool Enabled { get; }
Property Value
Type | Description |
---|---|
Boolean | true if the track is enabled and receiving audio; false otherwise (including after the track has been disposed). |
Remarks
Reading the value of this property after the track has been disposed is valid, and returns false.
The remote audio track enabled status is controlled by the remote peer only.
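For example, an application can read Enabled to reflect the remote peer's mute state in its UI. A minimal sketch, assuming track is a RemoteAudioTrack obtained from an AudioTrackAdded handler:

```csharp
// Sketch: read the remote-controlled enabled state of a track.
// "track" is assumed to be a RemoteAudioTrack obtained from an
// AudioTrackAdded handler (see the remarks above).
bool isReceivingAudio = track.Enabled; // valid even after Dispose(), then false
Console.WriteLine(isReceivingAudio
    ? "Remote audio active"
    : "Remote audio muted (by the remote peer)");
```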
Methods
CreateReadBuffer()
Starts buffering the audio frames from the track in an AudioTrackReadBuffer.
Declaration
public AudioTrackReadBuffer CreateReadBuffer()
Returns
Type | Description |
---|---|
AudioTrackReadBuffer | The newly created read buffer, which buffers audio frames from this track until the application reads them. |
Remarks
WebRTC audio tracks produce an audio frame every 10 ms. If you want the audio frames to be buffered (and optionally resampled) automatically, and you want the application to control when new audio data is read, create an AudioTrackReadBuffer using CreateReadBuffer(). If you want to process the audio frames as soon as they are received, without conversions, subscribe to AudioFrameReady instead.
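The following sketch shows the buffered approach, where the application pulls audio at its own pace. It assumes AudioTrackReadBuffer is disposable and that Read() takes a sample rate, channel count, output buffer, and out parameters for the sample count and overrun flag; the exact overload is an assumption, so check the AudioTrackReadBuffer reference for the precise signature.

```csharp
// Sketch: let the application pull audio on its own schedule instead of
// handling each 10 ms frame. The exact Read() overload below is an
// assumption; check the AudioTrackReadBuffer reference for the signature.
using (AudioTrackReadBuffer readBuffer = track.CreateReadBuffer())
{
    // Request 10 ms of 48 kHz stereo audio; the buffer resamples as needed.
    float[] samples = new float[48000 / 100 * 2];
    readBuffer.Read(48000, 2, samples, out int numSamplesRead, out bool hasOverrun);
    if (hasOverrun)
    {
        // The application read too slowly and some audio was dropped.
    }
    // Use the first numSamplesRead samples here...
}
```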
IsOutputToDevice()
Returns whether the track is output directly to the system audio device.
Declaration
public bool IsOutputToDevice()
Returns
Type | Description |
---|---|
Boolean | true if the track is currently output directly to the system audio device; false otherwise. |
OutputToDevice(Boolean)
Enable or disable output of the audio track to the system audio device used by WebRTC.
Declaration
public void OutputToDevice(bool output)
Parameters
Type | Name | Description |
---|---|---|
Boolean | output | true to output the audio track to the system audio device; false to stop outputting it while still receiving frames. |
Remarks
The default behavior is for every remote audio frame to be passed to remote audio frame callbacks, as well as output automatically to the audio device used by WebRTC. If false is passed to this function, remote audio frames will still be received and passed to callbacks, but won't be output to the audio device.
NOTE: Changing the default behavior is not supported on UWP.
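A minimal sketch combining IsOutputToDevice() and OutputToDevice(Boolean) to implement a local mute toggle for remote audio playback (the method name is hypothetical):

```csharp
// Sketch: local mute toggle for remote audio playback. Reception is
// unaffected; frames keep flowing to callbacks and read buffers.
// Changing the output behavior is not supported on UWP.
void ToggleRemoteAudioPlayback(RemoteAudioTrack track)
{
    bool currentlyPlaying = track.IsOutputToDevice();
    track.OutputToDevice(!currentlyPlaying);
}
```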
ToString()
Declaration
public override string ToString()
Returns
Type | Description |
---|---|
String |
Overrides
Events
AudioFrameReady
Event that occurs when a new audio frame is available from the source, either because the source produced it locally (AudioTrackSource, LocalAudioTrack) or because it received it from the remote peer (RemoteAudioTrack).
Declaration
public event AudioFrameDelegate AudioFrameReady
Event Type
Type | Description |
---|---|
AudioFrameDelegate |
Remarks
WebRTC audio tracks produce an audio frame every 10 ms. If you want to process the audio frames as soon as they are received, without conversions, subscribe to AudioFrameReady. If you want the audio frames to be buffered (and optionally resampled) automatically, and you want the application to control when new audio data is read, create an AudioTrackReadBuffer using CreateReadBuffer().
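As a sketch of the callback approach, the handler below logs basic information about each incoming frame. The AudioFrame field names used here (sampleRate, channelCount, sampleCount) are assumptions; check the AudioFrame reference for the exact members.

```csharp
// Sketch: process each remote audio frame as soon as it is received.
// The AudioFrame field names below are assumptions; check the AudioFrame docs.
track.AudioFrameReady += (AudioFrame frame) =>
{
    // Called roughly every 10 ms from a WebRTC thread; keep the handler short.
    Console.WriteLine(
        $"Frame: {frame.sampleRate} Hz, {frame.channelCount} ch, {frame.sampleCount} samples");
};
```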