Interface IAudioSource
Interface for audio sources, whether local sources/tracks or remote tracks.
Namespace: Microsoft.MixedReality.WebRTC
Assembly: Microsoft.MixedReality.WebRTC.dll
Syntax
public interface IAudioSource
Properties
Enabled
Enabled status of the source. If enabled, produces audio frames as expected. If disabled, produces silence instead.
Declaration
bool Enabled { get; }
Property Value
| Type | Description |
|---|---|
| Boolean | |
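
As a short illustration, the sketch below checks Enabled on an arbitrary IAudioSource before reporting its state. The helper class and the log messages are illustrative only; everything else comes from the members documented on this page.

```csharp
using System;
using Microsoft.MixedReality.WebRTC;

public static class AudioSourceStatus
{
    // Illustrative helper: 'source' can be any IAudioSource implementation
    // (for example a local or remote audio track obtained elsewhere).
    public static void LogState(IAudioSource source)
    {
        // Enabled is read-only on the interface. When it is false the
        // source keeps producing frames, but they contain silence.
        Console.WriteLine(source.Enabled
            ? "Source is producing audio frames."
            : "Source is disabled and producing silence.");
    }
}
```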
Methods
CreateReadBuffer()
Starts buffering the audio frames in an AudioTrackReadBuffer.
Declaration
AudioTrackReadBuffer CreateReadBuffer()
Returns
| Type | Description |
|---|---|
| AudioTrackReadBuffer | |
Remarks
WebRTC audio tracks produce an audio frame every 10 ms.
If you want the audio frames to be buffered (and optionally resampled) automatically,
and you want the application to control when new audio data is read, create an
AudioTrackReadBuffer using CreateReadBuffer().
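
A minimal sketch of this pull model follows, assuming the caller already holds an IAudioSource implementation. Only CreateReadBuffer() is taken from this page; the disposal call assumes AudioTrackReadBuffer is disposable, and its Read() overloads are only mentioned in comments (see the AudioTrackReadBuffer reference for their exact signatures).

```csharp
using Microsoft.MixedReality.WebRTC;

public static class BufferedAudioConsumer
{
    // Sketch: start buffering audio from any IAudioSource implementation.
    public static AudioTrackReadBuffer StartBuffering(IAudioSource source)
    {
        // Ask the source to buffer (and optionally resample) the 10 ms
        // frames it produces, so the application can pull audio data on
        // its own schedule, e.g. from an audio engine callback.
        AudioTrackReadBuffer readBuffer = source.CreateReadBuffer();

        // Audio is then pulled through the buffer's Read() overloads;
        // their exact signatures are documented on AudioTrackReadBuffer.
        return readBuffer;
    }

    // Sketch: stop buffering when the audio is no longer needed.
    // Assumes AudioTrackReadBuffer implements IDisposable.
    public static void StopBuffering(AudioTrackReadBuffer readBuffer)
    {
        readBuffer.Dispose();
    }
}
```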
Events
AudioFrameReady
Event that occurs when a new audio frame is available from the source, either
because the source produced it locally (local audio source or track) or because it received it from the remote peer (remote audio track).
Declaration
event AudioFrameDelegate AudioFrameReady
Event Type
| Type | Description |
|---|---|
| AudioFrameDelegate | |
Remarks
WebRTC audio tracks produce an audio frame every 10 ms.
If you want to process the audio frames as soon as they are received, without conversions,
subscribe to AudioFrameReady.
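
A minimal sketch of this push model follows. It assumes AudioFrameDelegate passes a single AudioFrame argument to the handler; the handler body is illustrative and does not touch the frame's fields.

```csharp
using System.Threading;
using Microsoft.MixedReality.WebRTC;

public class AudioFrameCounter
{
    private int _frameCount;

    public int FrameCount => _frameCount;

    // Subscribe to receive frames as soon as the source produces or
    // receives them, without any buffering or resampling.
    public void Attach(IAudioSource source)
    {
        source.AudioFrameReady += OnAudioFrameReady;
    }

    // Unsubscribe when frames are no longer needed.
    public void Detach(IAudioSource source)
    {
        source.AudioFrameReady -= OnAudioFrameReady;
    }

    // Assumed delegate shape: a single AudioFrame parameter. The callback
    // fires roughly every 10 ms and likely runs on a non-UI thread, so
    // keep the work here short.
    private void OnAudioFrameReady(AudioFrame frame)
    {
        Interlocked.Increment(ref _frameCount);
    }
}
```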