Class LocalAudioTrack
Audio track sending audio frames to the remote peer, with frames originating from a local track source (a local microphone or other audio recording device).
Namespace: Microsoft.MixedReality.WebRTC
Assembly: Microsoft.MixedReality.WebRTC.dll
Syntax
public class LocalAudioTrack : LocalMediaTrack, IAudioSource
Properties
Enabled
Enabled status of the track. If enabled, the track sends local audio frames to the remote peer as expected. If disabled, the track sends silence instead.
Declaration
public bool Enabled { get; set; }
Property Value
Type | Description |
---|---|
Boolean |
Remarks
Reading the value of this property after the track has been disposed is valid, and returns false. Writing to this property after the track has been disposed throws an exception.
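For instance, a mute button can simply toggle the property without renegotiating the connection (a minimal sketch; the surrounding track setup is assumed elsewhere):

```csharp
// Assuming 'localAudioTrack' is a live LocalAudioTrack instance.
// While disabled the track keeps sending silence, so the peer
// connection and transceiver stay negotiated and active.
void SetMicrophoneMuted(LocalAudioTrack localAudioTrack, bool muted)
{
    localAudioTrack.Enabled = !muted;
}
```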
Source
Audio track source this track is pulling its audio frames from.
Declaration
public AudioTrackSource Source { get; }
Property Value
Type | Description |
---|---|
AudioTrackSource |
Methods
CreateFromSource(AudioTrackSource, LocalAudioTrackInitConfig)
Create an audio track from an existing audio track source.
This does not add the track to any peer connection. Instead, the track must be added manually to an audio transceiver to be attached to a peer connection and transmitted to a remote peer.
Declaration
public static LocalAudioTrack CreateFromSource(AudioTrackSource source, LocalAudioTrackInitConfig initConfig)
Parameters
Type | Name | Description |
---|---|---|
AudioTrackSource | source | The track source which provides the raw audio frames to the newly created track. |
LocalAudioTrackInitConfig | initConfig | Configuration to initialize the track being created. |
Returns
Type | Description |
---|---|
LocalAudioTrack | The newly created track. |
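A typical usage might look like the following sketch, which assumes DeviceAudioTrackSource as the microphone source and an existing PeerConnection named pc:

```csharp
// Capture the default microphone as a reusable track source.
AudioTrackSource source = await DeviceAudioTrackSource.CreateAsync();

// Create the track from the source; the track name is arbitrary.
var initConfig = new LocalAudioTrackInitConfig { trackName = "microphone" };
LocalAudioTrack track = LocalAudioTrack.CreateFromSource(source, initConfig);

// The track is not yet attached to any peer connection; add it to an
// audio transceiver so it is transmitted to the remote peer.
Transceiver transceiver = pc.AddTransceiver(MediaKind.Audio);
transceiver.LocalAudioTrack = track;
transceiver.DesiredDirection = Transceiver.Direction.SendOnly;
```

Because the source is decoupled from the track, the same AudioTrackSource can feed tracks on several peer connections at once.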
CreateReadBuffer()
Starts buffering the audio frames from this track into an AudioTrackReadBuffer.
Declaration
public AudioTrackReadBuffer CreateReadBuffer()
Returns
Type | Description |
---|---|
AudioTrackReadBuffer | The newly created read buffer. |
Remarks
WebRTC audio tracks produce an audio frame every 10 ms.
If you want the audio frames to be buffered (and optionally resampled) automatically,
and you want the application to control when new audio data is read, create an
AudioTrackReadBuffer using CreateReadBuffer().
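Pull-mode consumption might then look like the sketch below. The exact Read() parameters (requested sample rate, channel count, destination buffer) are assumptions here; check the AudioTrackReadBuffer reference for the actual signature.

```csharp
// Sketch: read buffered (and resampled) audio at the application's pace.
using (AudioTrackReadBuffer readBuffer = track.CreateReadBuffer())
{
    // 480 samples = 10 ms of mono audio at 48 kHz.
    float[] samples = new float[480];

    // Hypothetical Read() call: request 48 kHz mono data; the buffer
    // resamples as needed and reports how many samples were written.
    readBuffer.Read(48000, 1, samples, out int numSamplesRead);

    // Feed 'samples' into the application's audio pipeline here.
}
```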
Dispose()
Remove the track from the associated Transceiver (if there is one) and release the corresponding resources.
Declaration
public override void Dispose()
Overrides
LocalMediaTrack.Dispose()
ToString()
Declaration
public override string ToString()
Returns
Type | Description |
---|---|
String |
Overrides
Events
AudioFrameReady
Event that occurs when a new audio frame is available from the source, either
because the source produced it locally (AudioTrackSource) or because it received
one from the remote peer (RemoteAudioTrack).
Declaration
public event AudioFrameDelegate AudioFrameReady
Event Type
Type | Description |
---|---|
AudioFrameDelegate |
Remarks
WebRTC audio tracks produce an audio frame every 10 ms.
If you want to process the audio frames as soon as they are received, without conversions,
subscribe to the AudioFrameReady event. If you want the frames buffered and read back
at the application's own pace, use CreateReadBuffer() instead.
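Push-mode consumption is then a matter of attaching a handler (a minimal sketch; the AudioFrame field names used below are assumptions, check the AudioFrame reference):

```csharp
// Sketch: process raw frames as soon as the track produces them.
// The handler runs on an internal WebRTC thread, so keep it short
// and avoid blocking calls inside it.
track.AudioFrameReady += (AudioFrame frame) =>
{
    // Hypothetical frame fields; copy the data out if it must
    // outlive the callback, since the frame buffer may be reused.
    System.Diagnostics.Debug.WriteLine(
        $"Audio frame: {frame.sampleRate} Hz, {frame.channelCount} channel(s)");
};
```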