Class ChatCompletionsClientAgent

- Namespace: AutoGen.AzureAIInference
- Assembly: AutoGen.AzureAIInference.dll
ChatCompletions client agent. This agent is a thin wrapper around Azure.AI.Inference.ChatCompletionsClient and provides support for the following message types:
- MessageEnvelope<T> where T is Azure.AI.Inference.ChatRequestMessage: chat request message.
- MessageEnvelope<T> where T is Azure.AI.Inference.ChatResponseMessage: chat response message.
- MessageEnvelope<T> where T is Azure.AI.Inference.StreamingChatCompletionsUpdate: streaming chat completions update.
- Inheritance: ChatCompletionsClientAgent
Constructors
ChatCompletionsClientAgent(ChatCompletionsClient, string, ChatCompletionsOptions, string)
Create a new instance of ChatCompletionsClientAgent.
public ChatCompletionsClientAgent(ChatCompletionsClient chatCompletionsClient, string name, ChatCompletionsOptions options, string systemMessage = "You are a helpful AI assistant")
Parameters
- chatCompletionsClient (ChatCompletionsClient): chat completions client
- name (string): agent name
- options (ChatCompletionsOptions): chat completion options; the options can't contain messages
- systemMessage (string): system message
ChatCompletionsClientAgent(ChatCompletionsClient, string, string, string, float, int, int?, ChatCompletionsResponseFormat?, IEnumerable<FunctionDefinition>?)
Create a new instance of ChatCompletionsClientAgent.
public ChatCompletionsClientAgent(ChatCompletionsClient chatCompletionsClient, string name, string modelName, string systemMessage = "You are a helpful AI assistant", float temperature = 0.7f, int maxTokens = 1024, int? seed = null, ChatCompletionsResponseFormat? responseFormat = null, IEnumerable<FunctionDefinition>? functions = null)
Parameters
- chatCompletionsClient (ChatCompletionsClient): chat completions client
- name (string): agent name
- modelName (string): model name, e.g. gpt-3.5-turbo
- systemMessage (string): system message
- temperature (float): temperature
- maxTokens (int): max tokens to generate
- seed (int?): seed to use; set it to enable deterministic output
- responseFormat (ChatCompletionsResponseFormat?): response format; set it to Azure.AI.Inference.ChatCompletionsResponseFormatJSON to enable JSON mode
- functions (IEnumerable<FunctionDefinition>?): functions
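As a sketch of how the convenience constructor above might be used — the endpoint, API key, and model name are placeholders, not real values:

```csharp
using System;
using Azure;
using Azure.AI.Inference;
using AutoGen.AzureAIInference;

// Hypothetical endpoint and API key; substitute your own deployment values.
var client = new ChatCompletionsClient(
    new Uri("https://your-endpoint.models.ai.azure.com"),
    new AzureKeyCredential("your-api-key"));

// Convenience overload: model name and tuning knobs are passed directly,
// instead of a pre-built ChatCompletionsOptions instance.
var agent = new ChatCompletionsClientAgent(
    chatCompletionsClient: client,
    name: "assistant",
    modelName: "gpt-3.5-turbo",
    systemMessage: "You are a helpful AI assistant",
    temperature: 0.7f,
    maxTokens: 1024);
```

The first overload, taking a ChatCompletionsOptions, is the choice when you need settings beyond those exposed here; note the options instance must not contain messages.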
Properties
Name
Property Value: string
Methods
GenerateReplyAsync(IEnumerable<IMessage>, GenerateReplyOptions?, CancellationToken)
Generate reply
public Task<IMessage> GenerateReplyAsync(IEnumerable<IMessage> messages, GenerateReplyOptions? options = null, CancellationToken cancellationToken = default)
Parameters
- messages (IEnumerable<IMessage>): conversation history
- options (GenerateReplyOptions?): completion options; if provided, they override the existing options, if any
- cancellationToken (CancellationToken)
Returns
- Task<IMessage>
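A minimal call sketch, assuming AutoGen.Core's MessageEnvelope.Create helper and Azure.AI.Inference's ChatRequestUserMessage, per the supported message types listed at the top of this page:

```csharp
using AutoGen.Core;
using Azure.AI.Inference;

// The agent natively accepts MessageEnvelope<T> where T is ChatRequestMessage.
var messages = new IMessage[]
{
    MessageEnvelope.Create<ChatRequestMessage>(
        new ChatRequestUserMessage("What is the capital of France?")),
};

// The reply wraps the chat response from the service.
IMessage reply = await agent.GenerateReplyAsync(messages);
```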
GenerateStreamingReplyAsync(IEnumerable<IMessage>, GenerateReplyOptions?, CancellationToken)
public IAsyncEnumerable<IMessage> GenerateStreamingReplyAsync(IEnumerable<IMessage> messages, GenerateReplyOptions? options = null, CancellationToken cancellationToken = default)
Parameters
- messages (IEnumerable<IMessage>): conversation history
- options (GenerateReplyOptions?)
- cancellationToken (CancellationToken)
Returns
- IAsyncEnumerable<IMessage>
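Consumption of the streaming variant can be sketched as follows, with messages prepared the same way as for GenerateReplyAsync; the pattern-match on IMessage<StreamingChatCompletionsUpdate> reflects the streaming message type listed at the top of this page:

```csharp
using System;
using AutoGen.Core;
using Azure.AI.Inference;

// Each yielded IMessage wraps a StreamingChatCompletionsUpdate chunk.
await foreach (var update in agent.GenerateStreamingReplyAsync(messages))
{
    if (update is IMessage<StreamingChatCompletionsUpdate> chunk)
    {
        // Print the incremental text as it arrives.
        Console.Write(chunk.Content.ContentUpdate);
    }
}
```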