Class LMStudioAgent
An agent that consumes the local inference server provided by LM Studio.
[Obsolete("Use OpenAIChatAgent to connect to LM Studio")]
public class LMStudioAgent : IAgent, IAgentMetaInformation
- Inheritance
object → LMStudioAgent
- Implements
IAgent, IAgentMetaInformation
Examples
// The namespaces below are assumptions based on AutoGen.Net 0.2+ and the official OpenAI .NET SDK; adjust to your package versions.
using System.ClientModel;
using AutoGen.Core;
using AutoGen.OpenAI;
using AutoGen.OpenAI.Extension;
using OpenAI;

var endpoint = "http://localhost:1234";
var openaiClient = new OpenAIClient(new ApiKeyCredential("api-key"), new OpenAIClientOptions
{
Endpoint = new Uri(endpoint),
});
var lmAgent = new OpenAIChatAgent(
chatClient: openaiClient.GetChatClient("<does-not-matter>"),
name: "assistant")
.RegisterMessageConnector()
.RegisterPrintMessage();
await lmAgent.SendAsync("Can you write a piece of C# code to calculate 100th of fibonacci?");
// output from the assistant (generated with llama-2-chat-7b; the exact output will vary depending on the model used)
//
// Of course! To calculate the 100th number in the Fibonacci sequence using C#, you can use the following code:```
// using System;
// class FibonacciSequence {
// static int Fibonacci(int n) {
// if (n <= 1) {
// return 1;
// } else {
// return Fibonacci(n - 1) + Fibonacci(n - 2);
// }
// }
// static void Main() {
// Console.WriteLine("The 100th number in the Fibonacci sequence is: " + Fibonacci(100));
// }
// }
// ```
// In this code, we define a function `Fibonacci` that takes an integer `n` as input and returns the `n`-th number in the Fibonacci sequence. The function uses a recursive approach to calculate the value of the sequence.
// The `Main` method simply calls the `Fibonacci` function with the argument `100`, and prints the result to the console.
// Note that this code will only work for positive integers `n`. If you want to calculate the Fibonacci sequence for other types of numbers, such as real or complex numbers, you will need to modify the code accordingly.
Constructors
LMStudioAgent(string, LMStudioConfig, string, float, int, IEnumerable<FunctionDefinition>?, IDictionary<string, Func<string, Task<string>>>?)
public LMStudioAgent(string name, LMStudioConfig config, string systemMessage = "You are a helpful AI assistant", float temperature = 0.7, int maxTokens = 1024, IEnumerable<FunctionDefinition>? functions = null, IDictionary<string, Func<string, Task<string>>>? functionMap = null)
Parameters
name
string
config
LMStudioConfig
systemMessage
string
temperature
float
maxTokens
int
functions
IEnumerable<FunctionDefinition>?
functionMap
IDictionary<string, Func<string, Task<string>>>?
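For legacy code that still constructs LMStudioAgent directly, here is a minimal sketch. It assumes LMStudioConfig can be created from a host and port (LM Studio's local server listens on localhost:1234 by default); the exact LMStudioConfig constructor may differ across AutoGen versions, and new code should use the OpenAIChatAgent approach from the Examples section instead.
// Minimal construction sketch; the LMStudioConfig(host, port) constructor is an assumption,
// so check your AutoGen version for the exact signature.
var config = new LMStudioConfig("localhost", 1234);

var legacyAgent = new LMStudioAgent(
    name: "assistant",
    config: config,
    systemMessage: "You are a helpful AI assistant",
    temperature: 0.7f,
    maxTokens: 1024);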
Properties
Name
public string Name { get; }
Property Value
string
Methods
GenerateReplyAsync(IEnumerable<IMessage>, GenerateReplyOptions?, CancellationToken)
Generate reply
public Task<IMessage> GenerateReplyAsync(IEnumerable<IMessage> messages, GenerateReplyOptions? options = null, CancellationToken cancellationToken = default)
Parameters
messages
IEnumerable<IMessage>
conversation history
options
GenerateReplyOptions
completion options. If provided, these override the agent's existing options.
cancellationToken
CancellationToken
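A usage sketch, continuing from the construction example in the Constructors section. TextMessage, Role, and the GetContent() helper come from AutoGen.Core; the Temperature member used on GenerateReplyOptions is an assumption, so consult your AutoGen version.
// Build a short conversation history and request a single reply.
var history = new List<IMessage>
{
    new TextMessage(Role.User, "What is the capital of France?"),
};

// Options and the cancellation token are optional; Temperature is assumed to exist on GenerateReplyOptions.
var reply = await legacyAgent.GenerateReplyAsync(
    messages: history,
    options: new GenerateReplyOptions { Temperature = 0.7f },
    cancellationToken: CancellationToken.None);

Console.WriteLine(reply.GetContent());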