Interface IAudioSource
Interface for audio sources, whether local sources/tracks or remote tracks.
Namespace: Microsoft.MixedReality.WebRTC
Assembly: Microsoft.MixedReality.WebRTC.dll
Syntax
public interface IAudioSource
Properties
Enabled
Enabled status of the source. If enabled, the source produces audio frames as expected; if disabled, it produces silence instead.
Declaration
bool Enabled { get; }
Property Value
Type | Description
---|---
Boolean | `true` if the source produces audio frames; `false` if it produces silence.
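As a brief sketch of how this property might be consumed (`source` and the logging helper are assumptions for illustration; any `IAudioSource` implementation such as `LocalAudioTrack` or `RemoteAudioTrack` would work):

```csharp
using System;
using Microsoft.MixedReality.WebRTC;

// Sketch: query the enabled state of any IAudioSource.
// `source` is assumed to be an existing IAudioSource instance.
static void LogSourceState(IAudioSource source)
{
    if (source.Enabled)
    {
        // The source is producing real audio frames.
        Console.WriteLine("Audio source is enabled.");
    }
    else
    {
        // The source is producing silence instead.
        Console.WriteLine("Audio source is disabled (silence).");
    }
}
```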
Methods
CreateReadBuffer()
Starts buffering the audio frames from this source in an AudioTrackReadBuffer.
Declaration
AudioTrackReadBuffer CreateReadBuffer()
Returns
Type | Description
---|---
AudioTrackReadBuffer | The newly created read buffer.
Remarks
WebRTC audio tracks produce an audio frame every 10 ms. If you want the audio frames to be buffered (and optionally resampled) automatically, and you want the application to control when new audio data is read, create an AudioTrackReadBuffer using CreateReadBuffer(). If you want to process the audio frames as soon as they are received, without conversions, subscribe to AudioFrameReady instead.
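A minimal sketch of the pull-style pattern described above, under stated assumptions: the `Read(...)` call, its parameter list, and the `Dispose()` cleanup are assumptions made for illustration and should be checked against the AudioTrackReadBuffer reference rather than taken as the verified signature.

```csharp
using Microsoft.MixedReality.WebRTC;

// Sketch: pull-style audio consumption via AudioTrackReadBuffer.
// `source` is assumed to be an existing IAudioSource instance
// (local or remote track).
AudioTrackReadBuffer readBuffer = source.CreateReadBuffer();

// The application decides when to read; the buffer accumulates
// (and can resample) the 10 ms frames produced by the track.
// NOTE: the Read(...) signature below is an assumption for
// illustration; consult the AudioTrackReadBuffer documentation.
float[] samples = new float[480]; // e.g. 10 ms at 48 kHz, mono
readBuffer.Read(48000, samples, 1,
    out int samplesRead, out bool hasOverrun);

// Release the buffer when the application no longer needs it.
readBuffer.Dispose();
```

The trade-off versus AudioFrameReady is that the application controls the read cadence (useful for audio engines with their own clock, e.g. a game mixer), at the cost of the buffering and optional resampling done on its behalf.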
Events
AudioFrameReady
Event that occurs when a new audio frame is available from the source, either because the source produced it locally (AudioTrackSource, LocalAudioTrack) or because it received it from the remote peer (RemoteAudioTrack).
Declaration
event AudioFrameDelegate AudioFrameReady
Event Type
Type | Description
---|---
AudioFrameDelegate | Delegate invoked when a new audio frame is available.
Remarks
WebRTC audio tracks produce an audio frame every 10 ms. If you want to process the audio frames as soon as they are received, without conversions, subscribe to AudioFrameReady. If you want the audio frames to be buffered (and optionally resampled) automatically, and you want the application to control when new audio data is read, create an AudioTrackReadBuffer using CreateReadBuffer().
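A push-style sketch of subscribing to this event; the `AudioFrame` parameter type follows from AudioFrameDelegate, but the field names accessed on it are assumptions for illustration:

```csharp
using System;
using Microsoft.MixedReality.WebRTC;

// Sketch: process frames as soon as they arrive, without
// buffering or conversion. `source` is assumed to be an
// existing IAudioSource instance.
source.AudioFrameReady += (AudioFrame frame) =>
{
    // Invoked roughly every 10 ms with the raw frame.
    // NOTE: the field names below are assumptions made for
    // illustration; check the AudioFrame reference.
    Console.WriteLine($"Frame: {frame.sampleRate} Hz, "
        + $"{frame.channelCount} channel(s)");
};
```

Because the handler runs on each 10 ms frame, it should do minimal work (or hand the data off to another thread) to avoid stalling the audio pipeline; if the application would rather control when audio is consumed, CreateReadBuffer() is the better fit.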