In Semantic Kernel, a kernel plugin is a collection of kernel functions that can be invoked during LLM calls. Semantic Kernel ships with a number of built-in plugins, such as the core plugins and the web search plugin, and you can also create and register your own. Kernel plugins greatly extend the capabilities of Semantic Kernel and can be used to perform tasks like web search, image search, and text summarization.
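As a quick illustration of what a custom plugin looks like, Semantic Kernel also lets you define one as a plain class whose methods are annotated with [KernelFunction]. The WeatherPlugin class below is a hypothetical example (the walkthrough that follows uses KernelFunctionFactory instead):

```csharp
using System.ComponentModel;
using Microsoft.SemanticKernel;

// Register the plugin with a kernel so its functions can be invoked during LLM calls.
var kernel = Kernel.CreateBuilder().Build();
var plugin = kernel.ImportPluginFromType<WeatherPlugin>("weather");

// A hypothetical custom plugin: each [KernelFunction] method becomes a kernel function.
public class WeatherPlugin
{
    [KernelFunction, Description("Get the weather for a location.")]
    public string GetWeather([Description("The location to check.")] string location)
        => $"The weather in {location} is 75 degrees Fahrenheit.";
}
```

Either style produces a KernelPlugin that can be passed to the middleware shown below.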
AutoGen.SemanticKernel provides a middleware called KernelPluginMiddleware that allows you to use Semantic Kernel plugins in other AutoGen agents, such as OpenAIChatAgent. In this example, we will define a simple plugin with a single GetWeather function and use it in OpenAIChatAgent.
Note: You can find the complete sample code here.
Step 1: add using statements
using AutoGen.Core;
using AutoGen.OpenAI;
using AutoGen.OpenAI.Extension;
using Microsoft.SemanticKernel;
using OpenAI;
Step 2: create plugin
In this step, we create a simple plugin with a single GetWeather
function that takes a location as input and returns the weather information for that location.
var openAIKey = Environment.GetEnvironmentVariable("OPENAI_API_KEY") ?? throw new Exception("Please set OPENAI_API_KEY environment variable.");
var modelId = "gpt-4o-mini";
var kernelBuilder = Kernel.CreateBuilder();
var kernel = kernelBuilder.Build();
var getWeatherFunction = KernelFunctionFactory.CreateFromMethod(
    method: (string location) => $"The weather in {location} is 75 degrees Fahrenheit.",
    functionName: "GetWeather",
    description: "Get the weather for a location.");
var plugin = kernel.CreatePluginFromFunctions("my_plugin", [getWeatherFunction]);
Step 3: create OpenAIChatAgent and use the plugin
In this step, we first create a KernelPluginMiddleware. The middleware loads the plugin and makes its functions available for use in other agents. We then create an OpenAIChatAgent and register the middleware with it.
// Create a middleware to handle the plugin functions
var kernelPluginMiddleware = new KernelPluginMiddleware(kernel, plugin);
var openAIClient = new OpenAIClient(openAIKey);
var openAIAgent = new OpenAIChatAgent(
    chatClient: openAIClient.GetChatClient(modelId),
    name: "assistant")
    .RegisterMessageConnector() // register message connector so it supports AutoGen built-in message types like TextMessage.
    .RegisterMiddleware(kernelPluginMiddleware) // register the middleware to handle the plugin functions
    .RegisterPrintMessage(); // pretty print the message to the console
Step 4: chat with OpenAIChatAgent
In this final step, we start the chat with the OpenAIChatAgent. The agent will use the GetWeather function from the plugin to get the weather information for Seattle.
var toolAggregateMessage = await openAIAgent.SendAsync("Tell me the weather in Seattle");
// The aggregate message will be converted to [ToolCallMessage, ToolCallResultMessage] when flowing into the agent
// send the aggregated message to the LLM to generate the final response
var finalReply = await openAIAgent.SendAsync(toolAggregateMessage);