{ "cells": [ { "cell_type": "markdown", "metadata": {}, "source": [ "# Models\n", "\n", "In many cases, agents need access to model services such as OpenAI, Azure OpenAI, or local models.\n", "AgentChat utilizes model clients provided by the\n", "[`autogen-ext`](../../core-user-guide/framework/model-clients.ipynb) package." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## OpenAI\n", "\n", "To access OpenAI models, install the `openai` extension, which allows you to use the {py:class}`~autogen_ext.models.OpenAIChatCompletionClient`." ] }, { "cell_type": "code", "execution_count": null, "metadata": { "vscode": { "languageId": "shellscript" } }, "outputs": [], "source": [ "pip install 'autogen-ext[openai]==0.4.0.dev6'" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "You will also need to obtain an [API key](https://platform.openai.com/account/api-keys) from OpenAI." ] }, { "cell_type": "code", "execution_count": 10, "metadata": {}, "outputs": [], "source": [ "from autogen_ext.models import OpenAIChatCompletionClient\n", "\n", "openai_model_client = OpenAIChatCompletionClient(\n", "    model=\"gpt-4o-2024-08-06\",\n", "    # api_key=\"sk-...\", # Optional if you have an OPENAI_API_KEY environment variable set.\n", ")" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "To test the model client, you can use the following code:" ] }, { "cell_type": "code", "execution_count": 11, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "CreateResult(finish_reason='stop', content='The capital of France is Paris.', usage=RequestUsage(prompt_tokens=15, completion_tokens=7), cached=False, logprobs=None)\n" ] } ], "source": [ "from autogen_core.components.models import UserMessage\n", "\n", "result = await openai_model_client.create([UserMessage(content=\"What is the capital of France?\", source=\"user\")])\n", "print(result)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Azure OpenAI\n", "\n", "Install the `azure` and `openai` extensions to use the {py:class}`~autogen_ext.models.AzureOpenAIChatCompletionClient`." ] }, { "cell_type": "code", "execution_count": null, "metadata": { "vscode": { "languageId": "shellscript" } }, "outputs": [], "source": [ "pip install 'autogen-ext[openai,azure]==0.4.0.dev6'" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "To use the client, you need to provide your deployment name, Azure OpenAI endpoint, API version, and model capabilities.\n", "For authentication, you can either provide an API key or an Azure Active Directory (AAD) token credential.\n", "\n", "The following code snippet shows how to use AAD authentication.\n", "The identity used must be assigned the [Cognitive Services OpenAI User](https://learn.microsoft.com/en-us/azure/ai-services/openai/how-to/role-based-access-control#cognitive-services-openai-user) role."
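, "\n", "\n", "Once the client in the following cell is created, it can be used in the same way as the OpenAI client above. Here is a minimal sketch of a test call (assuming your deployment supports chat completions and the credential has access to it):\n", "\n", "```python\n", "from autogen_core.components.models import UserMessage\n", "\n", "# Hypothetical usage sketch; az_model_client is the client created in the cell below.\n", "result = await az_model_client.create([UserMessage(content=\"What is the capital of France?\", source=\"user\")])\n", "print(result)\n", "```"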
] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "from autogen_ext.models import AzureOpenAIChatCompletionClient\n", "from azure.identity import DefaultAzureCredential, get_bearer_token_provider\n", "\n", "# Create the token provider\n", "token_provider = get_bearer_token_provider(DefaultAzureCredential(), \"https://cognitiveservices.azure.com/.default\")\n", "\n", "az_model_client = AzureOpenAIChatCompletionClient(\n", "    model=\"{your-azure-deployment}\",\n", "    api_version=\"2024-06-01\",\n", "    azure_endpoint=\"https://{your-custom-endpoint}.openai.azure.com/\",\n", "    azure_ad_token_provider=token_provider,  # Optional if you choose key-based authentication.\n", "    # api_key=\"sk-...\",  # For key-based authentication.\n", "    model_capabilities={\n", "        \"vision\": True,\n", "        \"function_calling\": True,\n", "        \"json_output\": True,\n", "    },\n", ")" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "See [here](https://learn.microsoft.com/en-us/azure/ai-services/openai/how-to/managed-identity#chat-completions) for more information about using the Azure client directly." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Local Models\n", "\n", "We are working on it. Stay tuned!" ] } ], "metadata": { "kernelspec": { "display_name": ".venv", "language": "python", "name": "python3" }, "language_info": { "codemirror_mode": { "name": "ipython", "version": 3 }, "file_extension": ".py", "mimetype": "text/x-python", "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", "version": "3.11.5" } }, "nbformat": 4, "nbformat_minor": 2 }