{ "cells": [ { "cell_type": "markdown", "metadata": {}, "source": [ "# Model Clients\n", "\n", "AutoGen provides the {py:mod}`autogen_core.components.models` module with a suite of built-in\n", "model clients for using ChatCompletion API.\n", "All model clients implement the {py:class}`~autogen_core.components.models.ChatCompletionClient` protocol class." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Built-in Model Clients\n", "\n", "Currently there are two built-in model clients:\n", "{py:class}`~autogen_core.components.models.OpenAIChatCompletionClient` and\n", "{py:class}`~autogen_core.components.models.AzureOpenAIChatCompletionClient`.\n", "Both clients are asynchronous.\n", "\n", "To use the {py:class}`~autogen_core.components.models.OpenAIChatCompletionClient`, you need to provide the API key\n", "either through the environment variable `OPENAI_API_KEY` or through the `api_key` argument." ] }, { "cell_type": "code", "execution_count": 1, "metadata": {}, "outputs": [], "source": [ "from autogen_core.components.models import OpenAIChatCompletionClient, UserMessage\n", "\n", "# Create an OpenAI model client.\n", "model_client = OpenAIChatCompletionClient(\n", " model=\"gpt-4o\",\n", " # api_key=\"sk-...\", # Optional if you have an API key set in the environment.\n", ")" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "You can call the {py:meth}`~autogen_core.components.models.OpenAIChatCompletionClient.create` method to create a\n", "chat completion request, and await for an {py:class}`~autogen_core.components.models.CreateResult` object in return." ] }, { "cell_type": "code", "execution_count": 2, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "The capital of France is Paris.\n" ] } ], "source": [ "# Send a message list to the model and await the response.\n", "messages = [\n", " UserMessage(content=\"What is the capital of France?\", source=\"user\"),\n", "]\n", "response = await model_client.create(messages=messages)\n", "\n", "# Print the response\n", "print(response.content)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Streaming Response\n", "\n", "You can use the {py:meth}`~autogen_core.components.models.OpenAIChatCompletionClient.create_streaming` method to create a\n", "chat completion request with streaming response." ] }, { "cell_type": "code", "execution_count": 3, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Streamed responses:\n", "In a secluded valley where the sun painted the sky with hues of gold, a solitary dragon named Bremora stood guard. Her emerald scales shimmered with an ancient light as she watched over the village below. Unlike her fiery kin, Bremora had no desire for destruction; her soul was bound by a promise to protect.\n", "\n", "Generations ago, a wise elder had befriended Bremora, offering her companionship instead of fear. In gratitude, she vowed to shield the village from calamity. Years passed, and children grew up believing in the legends of a watchful dragon who brought them prosperity and peace.\n", "\n", "One summer, an ominous storm threatened the valley, with ravenous winds and torrents of rain. Bremora rose into the tempest, her mighty wings defying the chaos. She channeled her breath—not of fire, but of warmth and tranquility—calming the storm and saving her cherished valley.\n", "\n", "When dawn broke and the village emerged unscathed, the people looked to the sky. 
There, Bremora soared gracefully, a guardian spirit woven into their lives, silently promising her eternal vigilance.\n", "\n", "------------\n", "\n", "The complete response:\n", "In a secluded valley where the sun painted the sky with hues of gold, a solitary dragon named Bremora stood guard. Her emerald scales shimmered with an ancient light as she watched over the village below. Unlike her fiery kin, Bremora had no desire for destruction; her soul was bound by a promise to protect.\n", "\n", "Generations ago, a wise elder had befriended Bremora, offering her companionship instead of fear. In gratitude, she vowed to shield the village from calamity. Years passed, and children grew up believing in the legends of a watchful dragon who brought them prosperity and peace.\n", "\n", "One summer, an ominous storm threatened the valley, with ravenous winds and torrents of rain. Bremora rose into the tempest, her mighty wings defying the chaos. She channeled her breath—not of fire, but of warmth and tranquility—calming the storm and saving her cherished valley.\n", "\n", "When dawn broke and the village emerged unscathed, the people looked to the sky. There, Bremora soared gracefully, a guardian spirit woven into their lives, silently promising her eternal vigilance.\n" ] } ], "source": [ "messages = [\n", " UserMessage(content=\"Write a very short story about a dragon.\", source=\"user\"),\n", "]\n", "\n", "# Create a stream.\n", "stream = model_client.create_stream(messages=messages)\n", "\n", "# Iterate over the stream and print the responses.\n", "print(\"Streamed responses:\")\n", "async for response in stream: # type: ignore\n", " if isinstance(response, str):\n", " # A partial response is a string.\n", " print(response, flush=True, end=\"\")\n", " else:\n", " # The last response is a CreateResult object with the complete message.\n", " print(\"\\n\\n------------\\n\")\n", " print(\"The complete response:\", flush=True)\n", " print(response.content, flush=True)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "```{note}\n", "The last response in the streaming response is always the final response\n", "of the type {py:class}`~autogen_core.components.models.CreateResult`.\n", "```" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Azure OpenAI\n", "\n", "To use the {py:class}`~autogen_core.components.models.AzureOpenAIChatCompletionClient`, you need to provide\n", "the deployment id, Azure Cognitive Services endpoint, api version, and model capabilities.\n", "For authentication, you can either provide an API key or an Azure Active Directory (AAD) token credential.\n", "To use AAD authentication, you need to first install the `azure-identity` package." ] }, { "cell_type": "code", "execution_count": 11, "metadata": { "vscode": { "languageId": "shellscript" } }, "outputs": [], "source": [ "# pip install azure-identity" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "The following code snippet shows how to use AAD authentication.\n", "The identity used must be assigned the [**Cognitive Services OpenAI User**](https://learn.microsoft.com/en-us/azure/ai-services/openai/how-to/role-based-access-control#cognitive-services-openai-user) role." 
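, "\n", "For comparison, here is a minimal key-based sketch first; the deployment name, endpoint, and API key below are placeholders, and the cell simply mirrors the options of the AAD-based snippet that follows it." ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "from autogen_core.components.models import AzureOpenAIChatCompletionClient\n", "\n", "# A sketch with placeholder values: substitute your own deployment name, API version, endpoint, and key.\n", "az_model_client_with_key = AzureOpenAIChatCompletionClient(\n", "    model=\"{your-azure-deployment}\",\n", "    api_version=\"2024-06-01\",\n", "    azure_endpoint=\"https://{your-custom-endpoint}.openai.azure.com/\",\n", "    api_key=\"{your-api-key}\",  # Key-based authentication instead of an AAD token provider.\n", "    model_capabilities={\n", "        \"vision\": True,\n", "        \"function_calling\": True,\n", "        \"json_output\": True,\n", "    },\n", ")" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "The AAD-based version replaces the `api_key` with a token provider:"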
] }, { "cell_type": "code", "execution_count": 15, "metadata": {}, "outputs": [], "source": [ "from autogen_core.components.models import AzureOpenAIChatCompletionClient\n", "from azure.identity import DefaultAzureCredential, get_bearer_token_provider\n", "\n", "# Create the token provider\n", "token_provider = get_bearer_token_provider(DefaultAzureCredential(), \"https://cognitiveservices.azure.com/.default\")\n", "\n", "az_model_client = AzureOpenAIChatCompletionClient(\n", " model=\"{your-azure-deployment}\",\n", " api_version=\"2024-06-01\",\n", " azure_endpoint=\"https://{your-custom-endpoint}.openai.azure.com/\",\n", " azure_ad_token_provider=token_provider, # Optional if you choose key-based authentication.\n", " # api_key=\"sk-...\", # For key-based authentication.\n", " model_capabilities={\n", " \"vision\": True,\n", " \"function_calling\": True,\n", " \"json_output\": True,\n", " },\n", ")" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "```{note}\n", "See [here](https://learn.microsoft.com/en-us/azure/ai-services/openai/how-to/managed-identity#chat-completions) for how to use the Azure client directly or for more info.\n", "```" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Build Agent using Model Client\n", "\n", "Let's create a simple AI agent that can respond to messages using the ChatCompletion API." ] }, { "cell_type": "code", "execution_count": 4, "metadata": {}, "outputs": [], "source": [ "from dataclasses import dataclass\n", "\n", "from autogen_core.application import SingleThreadedAgentRuntime\n", "from autogen_core.base import MessageContext\n", "from autogen_core.components import RoutedAgent, message_handler\n", "from autogen_core.components.models import ChatCompletionClient, OpenAIChatCompletionClient, SystemMessage, UserMessage\n", "\n", "\n", "@dataclass\n", "class Message:\n", " content: str\n", "\n", "\n", "class SimpleAgent(RoutedAgent):\n", " def __init__(self, model_client: ChatCompletionClient) -> None:\n", " super().__init__(\"A simple agent\")\n", " self._system_messages = [SystemMessage(\"You are a helpful AI assistant.\")]\n", " self._model_client = model_client\n", "\n", " @message_handler\n", " async def handle_user_message(self, message: Message, ctx: MessageContext) -> Message:\n", " # Prepare input to the chat completion model.\n", " user_message = UserMessage(content=message.content, source=\"user\")\n", " response = await self._model_client.create(\n", " self._system_messages + [user_message], cancellation_token=ctx.cancellation_token\n", " )\n", " # Return with the model's response.\n", " assert isinstance(response.content, str)\n", " return Message(content=response.content)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "The `SimpleAgent` class is a subclass of the\n", "{py:class}`autogen_core.components.RoutedAgent` class for the convenience of automatically routing messages to the appropriate handlers.\n", "It has a single handler, `handle_user_message`, which handles message from the user. It uses the `ChatCompletionClient` to generate a response to the message.\n", "It then returns the response to the user, following the direct communication model.\n", "\n", "```{note}\n", "The `cancellation_token` of the type {py:class}`autogen_core.base.CancellationToken` is used to cancel\n", "asynchronous operations. 
It is linked to async calls inside the message handlers\n", "and can be used by the caller to cancel the handlers.\n", "```" ] }, { "cell_type": "code", "execution_count": 6, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Seattle is a vibrant city with a wide range of activities and attractions. Here are some fun things to do in Seattle:\n", "\n", "1. **Space Needle**: Visit this iconic observation tower for stunning views of the city and surrounding mountains.\n", "\n", "2. **Pike Place Market**: Explore this historic market where you can see the famous fish toss, buy local produce, and find unique crafts and eateries.\n", "\n", "3. **Museum of Pop Culture (MoPOP)**: Dive into the world of contemporary culture, music, and science fiction at this interactive museum.\n", "\n", "4. **Chihuly Garden and Glass**: Marvel at the beautiful glass art installations by artist Dale Chihuly, located right next to the Space Needle.\n", "\n", "5. **Seattle Aquarium**: Discover the diverse marine life of the Pacific Northwest at this engaging aquarium.\n", "\n", "6. **Seattle Art Museum**: Explore a vast collection of art from around the world, including contemporary and indigenous art.\n", "\n", "7. **Kerry Park**: For one of the best views of the Seattle skyline, head to this small park on Queen Anne Hill.\n", "\n", "8. **Ballard Locks**: Watch boats pass through the locks and observe the salmon ladder to see salmon migrating.\n", "\n", "9. **Ferry to Bainbridge Island**: Take a scenic ferry ride across Puget Sound to enjoy charming shops, restaurants, and beautiful natural scenery.\n", "\n", "10. **Olympic Sculpture Park**: Stroll through this outdoor park with large-scale sculptures and stunning views of the waterfront and mountains.\n", "\n", "11. **Underground Tour**: Discover Seattle's history on this quirky tour of the city's underground passageways in Pioneer Square.\n", "\n", "12. **Seattle Waterfront**: Enjoy the shops, restaurants, and attractions along the waterfront, including the Seattle Great Wheel and the aquarium.\n", "\n", "13. **Discovery Park**: Explore the largest green space in Seattle, featuring trails, beaches, and views of Puget Sound.\n", "\n", "14. **Food Tours**: Try out Seattle’s diverse culinary scene, including fresh seafood, international cuisines, and coffee culture (don’t miss the original Starbucks!).\n", "\n", "15. 
**Attend a Sports Game**: Catch a Seahawks (NFL), Mariners (MLB), or Sounders (MLS) game for a lively local experience.\n", "\n", "Whether you're interested in culture, nature, food, or history, Seattle has something for everyone to enjoy!\n" ] } ], "source": [ "# Create the runtime and register the agent.\n", "from autogen_core.base import AgentId\n", "\n", "runtime = SingleThreadedAgentRuntime()\n", "await SimpleAgent.register(\n", " runtime,\n", " \"simple_agent\",\n", " lambda: SimpleAgent(\n", " OpenAIChatCompletionClient(\n", " model=\"gpt-4o-mini\",\n", " # api_key=\"sk-...\", # Optional if you have an OPENAI_API_KEY set in the environment.\n", " )\n", " ),\n", ")\n", "# Start the runtime processing messages.\n", "runtime.start()\n", "# Send a message to the agent and get the response.\n", "message = Message(\"Hello, what are some fun things to do in Seattle?\")\n", "response = await runtime.send_message(message, AgentId(\"simple_agent\", \"default\"))\n", "print(response.content)\n", "# Stop the runtime processing messages.\n", "await runtime.stop()" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Manage Model Context\n", "\n", "The above `SimpleAgent` always responds with a fresh context that contains only\n", "the system message and the latest user message.\n", "We can use model context classes from {py:mod}`autogen_core.components.model_context`\n", "to make the agent \"remember\" previous conversations.\n", "A model context supports storage and retrieval of Chat Completion messages.\n", "It is always used together with a model client to generate LLM-based responses.\n", "\n", "For example, {py:class}`~autogen_core.components.model_context.BufferedChatCompletionContext`\n", "is a most-recent-used (MRU) context that stores the most recent `buffer_size`\n", "messages. This is useful for avoiding context overflow in many LLMs.\n", "\n", "Let's update the previous example to use\n", "{py:class}`~autogen_core.components.model_context.BufferedChatCompletionContext`."
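, "\n", "Before updating the agent, here is a small standalone sketch (not part of the original example) of how the buffered context behaves: `add_message` stores a message, and `get_messages` returns only the most recent `buffer_size` messages." ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "from autogen_core.components.model_context import BufferedChatCompletionContext\n", "from autogen_core.components.models import AssistantMessage, UserMessage\n", "\n", "# A sketch: a buffered context that keeps only the 2 most recent messages.\n", "buffered_context = BufferedChatCompletionContext(buffer_size=2)\n", "await buffered_context.add_message(UserMessage(content=\"Hello!\", source=\"user\"))\n", "await buffered_context.add_message(AssistantMessage(content=\"Hi! How can I help you?\", source=\"assistant\"))\n", "await buffered_context.add_message(UserMessage(content=\"What is AutoGen?\", source=\"user\"))\n", "\n", "# Only the 2 most recent messages are returned.\n", "print(await buffered_context.get_messages())" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Now, here is the updated agent that uses the buffered context."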
] }, { "cell_type": "code", "execution_count": 9, "metadata": {}, "outputs": [], "source": [ "from autogen_core.components.model_context import BufferedChatCompletionContext\n", "from autogen_core.components.models import AssistantMessage\n", "\n", "\n", "class SimpleAgentWithContext(RoutedAgent):\n", " def __init__(self, model_client: ChatCompletionClient) -> None:\n", " super().__init__(\"A simple agent\")\n", " self._system_messages = [SystemMessage(\"You are a helpful AI assistant.\")]\n", " self._model_client = model_client\n", " self._model_context = BufferedChatCompletionContext(buffer_size=5)\n", "\n", " @message_handler\n", " async def handle_user_message(self, message: Message, ctx: MessageContext) -> Message:\n", " # Prepare input to the chat completion model.\n", " user_message = UserMessage(content=message.content, source=\"user\")\n", " # Add message to model context.\n", " await self._model_context.add_message(user_message)\n", " # Generate a response.\n", " response = await self._model_client.create(\n", " self._system_messages + (await self._model_context.get_messages()),\n", " cancellation_token=ctx.cancellation_token,\n", " )\n", " # Return with the model's response.\n", " assert isinstance(response.content, str)\n", " # Add message to model context.\n", " await self._model_context.add_message(AssistantMessage(content=response.content, source=self.metadata[\"type\"]))\n", " return Message(content=response.content)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Now let's try to ask follow up questions after the first one." ] }, { "cell_type": "code", "execution_count": 10, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Question: Hello, what are some fun things to do in Seattle?\n", "Response: Seattle offers a wide variety of fun activities and attractions for visitors. Here are some highlights:\n", "\n", "1. **Pike Place Market**: Explore this iconic market, where you can find fresh produce, unique crafts, and the famous fish-throwing vendors. Don’t forget to visit the original Starbucks!\n", "\n", "2. **Space Needle**: Enjoy breathtaking views of the city and Mount Rainier from the observation deck of this iconic structure. You can also dine at the SkyCity restaurant.\n", "\n", "3. **Chihuly Garden and Glass**: Admire the stunning glass art installations created by artist Dale Chihuly. The garden and exhibit are particularly beautiful, especially in good weather.\n", "\n", "4. **Museum of Pop Culture (MoPOP)**: Dive into the world of music, science fiction, and pop culture through interactive exhibits and memorabilia.\n", "\n", "5. **Seattle Aquarium**: Located on the waterfront, the aquarium features a variety of marine life native to the Pacific Northwest, including otters and diving birds.\n", "\n", "6. **Seattle Art Museum (SAM)**: Explore a diverse collection of art from around the world, including Native American art and contemporary pieces.\n", "\n", "7. **Ballard Locks**: Watch boats travel between the Puget Sound and Lake Union, and see salmon navigating the fish ladder during spawning season.\n", "\n", "8. **Fremont Troll**: Visit this quirky public art installation located under the Aurora Bridge, where you can take fun photos with the giant troll.\n", "\n", "9. **Kerry Park**: For a picturesque view of the Seattle skyline, head to Kerry Park on Queen Anne Hill, especially at sunset.\n", "\n", "10. 
**Take a Ferry Ride**: Enjoy the scenic views while taking a ferry to nearby Bainbridge Island or Vashon Island for a relaxing day trip.\n", "\n", "11. **Underground Tour**: Explore Seattle’s history on an entertaining underground tour in Pioneer Square, where you’ll learn about the city’s early days.\n", "\n", "12. **Attend a Sporting Event**: Depending on the season, catch a Seattle Seahawks (NFL) game, a Seattle Mariners (MLB) game, or a Seattle Sounders (MLS) match.\n", "\n", "13. **Explore Discovery Park**: Enjoy nature with hiking trails, beach access, and stunning views of the Puget Sound and Olympic Mountains.\n", "\n", "14. **West Seattle’s Alki Beach**: Relax at this beach with beautiful views of the Seattle skyline and enjoy beachside activities like biking or kayaking.\n", "\n", "15. **Dining and Craft Beer**: Seattle has a vibrant food scene and is known for its seafood, coffee culture, and craft breweries. Make sure to explore local restaurants and breweries.\n", "\n", "There’s something for everyone in Seattle, whether you’re interested in nature, art, history, or food!\n", "-----\n", "Question: What was the first thing you mentioned?\n", "Response: The first thing I mentioned was **Pike Place Market**, an iconic market in Seattle where you can find fresh produce, unique crafts, and experience the famous fish-throwing vendors. It's also home to the original Starbucks and various charming shops and eateries.\n" ] } ], "source": [ "runtime = SingleThreadedAgentRuntime()\n", "await SimpleAgentWithContext.register(\n", " runtime,\n", " \"simple_agent_context\",\n", " lambda: SimpleAgentWithContext(\n", " OpenAIChatCompletionClient(\n", " model=\"gpt-4o-mini\",\n", " # api_key=\"sk-...\", # Optional if you have an OPENAI_API_KEY set in the environment.\n", " )\n", " ),\n", ")\n", "# Start the runtime processing messages.\n", "runtime.start()\n", "agent_id = AgentId(\"simple_agent_context\", \"default\")\n", "\n", "# First question.\n", "message = Message(\"Hello, what are some fun things to do in Seattle?\")\n", "print(f\"Question: {message.content}\")\n", "response = await runtime.send_message(message, agent_id)\n", "print(f\"Response: {response.content}\")\n", "print(\"-----\")\n", "\n", "# Second question.\n", "message = Message(\"What was the first thing you mentioned?\")\n", "print(f\"Question: {message.content}\")\n", "response = await runtime.send_message(message, agent_id)\n", "print(f\"Response: {response.content}\")\n", "\n", "# Stop the runtime processing messages.\n", "await runtime.stop()" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "From the second response, you can see the agent now can recall its own previous responses." ] } ], "metadata": { "kernelspec": { "display_name": "autogen_core", "language": "python", "name": "python3" }, "language_info": { "codemirror_mode": { "name": "ipython", "version": 3 }, "file_extension": ".py", "mimetype": "text/x-python", "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", "version": "3.11.9" } }, "nbformat": 4, "nbformat_minor": 2 }