{ "cells": [ { "cell_type": "markdown", "metadata": {}, "source": [ "# Tools\n", "\n", "Tools are code that can be executed by an agent to perform actions. A tool\n", "can be a simple function such as a calculator, or an API call to a third-party service\n", "such as stock price lookup and weather forecast.\n", "In the context of AI agents, tools are designed to be executed by agents in\n", "response to model-generated function calls.\n", "\n", "AutoGen provides the {py:mod}`autogen_core.components.tools` module with a suite of built-in\n", "tools and utilities for creating and running custom tools." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Built-in Tools\n", "\n", "One of the built-in tools is the {py:class}`~autogen_core.components.tools.PythonCodeExecutionTool`,\n", "which allows agents to execute Python code snippets.\n", "\n", "Here is how you create the tool and use it." ] }, { "cell_type": "code", "execution_count": 1, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Hello, world!\n", "\n" ] } ], "source": [ "from autogen_core.base import CancellationToken\n", "from autogen_core.components.tools import PythonCodeExecutionTool\n", "from autogen_ext.code_executor.docker_executor import DockerCommandLineCodeExecutor\n", "\n", "# Create the tool.\n", "code_executor = DockerCommandLineCodeExecutor()\n", "await code_executor.start()\n", "code_execution_tool = PythonCodeExecutionTool(code_executor)\n", "cancellation_token = CancellationToken()\n", "\n", "# Use the tool directly without an agent.\n", "code = \"print('Hello, world!')\"\n", "result = await code_execution_tool.run_json({\"code\": code}, cancellation_token)\n", "print(code_execution_tool.return_value_as_string(result))" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "The {py:class}`~autogen_core.components.code_executor.docker_executorCommandLineCodeExecutor`\n", "class is a built-in code executor that runs Python code snippets in a subprocess\n", "in the local command line environment.\n", "The {py:class}`~autogen_core.components.tools.PythonCodeExecutionTool` class wraps the code executor\n", "and provides a simple interface to execute Python code snippets.\n", "\n", "Other built-in tools will be added in the future." 
] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Custom Function Tools\n", "\n", "A tool can also be a simple Python function that performs a specific action.\n", "To create a custom function tool, you just need to create a Python function\n", "and use the {py:class}`~autogen_core.components.tools.FunctionTool` class to wrap it.\n", "\n", "For example, a simple tool to obtain the stock price of a company might look like this:" ] }, { "cell_type": "code", "execution_count": 2, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "80.44429939059668\n" ] } ], "source": [ "import random\n", "\n", "from autogen_core.base import CancellationToken\n", "from autogen_core.components.tools import FunctionTool\n", "from typing_extensions import Annotated\n", "\n", "\n", "async def get_stock_price(ticker: str, date: Annotated[str, \"Date in YYYY/MM/DD\"]) -> float:\n", " # Returns a random stock price for demonstration purposes.\n", " return random.uniform(10, 200)\n", "\n", "\n", "# Create a function tool.\n", "stock_price_tool = FunctionTool(get_stock_price, description=\"Get the stock price.\")\n", "\n", "# Run the tool.\n", "cancellation_token = CancellationToken()\n", "result = await stock_price_tool.run_json({\"ticker\": \"AAPL\", \"date\": \"2021/01/01\"}, cancellation_token)\n", "\n", "# Print the result.\n", "print(stock_price_tool.return_value_as_string(result))" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Tool-Equipped Agent\n", "\n", "To use tools with an agent, you can use {py:class}`~autogen_core.components.tool_agent.ToolAgent`,\n", "by using it in a composition pattern.\n", "Here is an example tool-use agent that uses {py:class}`~autogen_core.components.tool_agent.ToolAgent`\n", "as an inner agent for executing tools." 
] }, { "cell_type": "code", "execution_count": 3, "metadata": {}, "outputs": [], "source": [ "from dataclasses import dataclass\n", "from typing import List\n", "\n", "from autogen_core.application import SingleThreadedAgentRuntime\n", "from autogen_core.base import AgentId, AgentInstantiationContext, MessageContext\n", "from autogen_core.components import RoutedAgent, message_handler\n", "from autogen_core.components.models import (\n", " ChatCompletionClient,\n", " LLMMessage,\n", " OpenAIChatCompletionClient,\n", " SystemMessage,\n", " UserMessage,\n", ")\n", "from autogen_core.components.tool_agent import ToolAgent, tool_agent_caller_loop\n", "from autogen_core.components.tools import FunctionTool, Tool, ToolSchema\n", "\n", "\n", "@dataclass\n", "class Message:\n", " content: str\n", "\n", "\n", "class ToolUseAgent(RoutedAgent):\n", " def __init__(self, model_client: ChatCompletionClient, tool_schema: List[ToolSchema], tool_agent_type: str) -> None:\n", " super().__init__(\"An agent with tools\")\n", " self._system_messages: List[LLMMessage] = [SystemMessage(\"You are a helpful AI assistant.\")]\n", " self._model_client = model_client\n", " self._tool_schema = tool_schema\n", " self._tool_agent_id = AgentId(tool_agent_type, self.id.key)\n", "\n", " @message_handler\n", " async def handle_user_message(self, message: Message, ctx: MessageContext) -> Message:\n", " # Create a session of messages.\n", " session: List[LLMMessage] = [UserMessage(content=message.content, source=\"user\")]\n", " # Run the caller loop to handle tool calls.\n", " messages = await tool_agent_caller_loop(\n", " self,\n", " tool_agent_id=self._tool_agent_id,\n", " model_client=self._model_client,\n", " input_messages=session,\n", " tool_schema=self._tool_schema,\n", " cancellation_token=ctx.cancellation_token,\n", " )\n", " # Return the final response.\n", " assert isinstance(messages[-1].content, str)\n", " return Message(content=messages[-1].content)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "The `ToolUseAgent` class uses a convenience function {py:meth}`~autogen_core.components.tool_agent.tool_agent_caller_loop`, \n", "to handle the interaction between the model and the tool agent.\n", "The core idea can be described using a simple control flow graph:\n", "\n", "![ToolUseAgent control flow graph](tool-use-agent-cfg.svg)\n", "\n", "The `ToolUseAgent`'s `handle_user_message` handler handles messages from the user,\n", "and determines whether the model has generated a tool call.\n", "If the model has generated tool calls, then the handler sends a function call\n", "message to the {py:class}`~autogen_core.components.tool_agent.ToolAgent` agent\n", "to execute the tools,\n", "and then queries the model again with the results of the tool calls.\n", "This process continues until the model stops generating tool calls,\n", "at which point the final response is returned to the user.\n", "\n", "By having the tool execution logic in a separate agent,\n", "we expose the model-tool interactions to the agent runtime as messages, so the tool executions\n", "can be observed externally and intercepted if necessary.\n", "\n", "To run the agent, we need to create a runtime and register the agent." 
] }, { "cell_type": "code", "execution_count": 4, "metadata": {}, "outputs": [ { "data": { "text/plain": [ "AgentType(type='tool_use_agent')" ] }, "execution_count": 4, "metadata": {}, "output_type": "execute_result" } ], "source": [ "# Create a runtime.\n", "runtime = SingleThreadedAgentRuntime()\n", "# Create the tools.\n", "tools: List[Tool] = [FunctionTool(get_stock_price, description=\"Get the stock price.\")]\n", "# Register the agents.\n", "await ToolAgent.register(runtime, \"tool_executor_agent\", lambda: ToolAgent(\"tool executor agent\", tools))\n", "await ToolUseAgent.register(\n", " runtime,\n", " \"tool_use_agent\",\n", " lambda: ToolUseAgent(\n", " OpenAIChatCompletionClient(model=\"gpt-4o-mini\"), [tool.schema for tool in tools], \"tool_executor_agent\"\n", " ),\n", ")" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "This example uses the {py:class}`autogen_core.components.models.OpenAIChatCompletionClient`,\n", "for Azure OpenAI and other clients, see [Model Clients](./model-clients.ipynb).\n", "Let's test the agent with a question about stock price." ] }, { "cell_type": "code", "execution_count": 5, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "The stock price of NVDA (NVIDIA Corporation) on June 1, 2024, was approximately $179.46.\n" ] } ], "source": [ "# Start processing messages.\n", "runtime.start()\n", "# Send a direct message to the tool agent.\n", "tool_use_agent = AgentId(\"tool_use_agent\", \"default\")\n", "response = await runtime.send_message(Message(\"What is the stock price of NVDA on 2024/06/01?\"), tool_use_agent)\n", "print(response.content)\n", "# Stop processing messages.\n", "await runtime.stop()" ] } ], "metadata": { "kernelspec": { "display_name": "autogen_core", "language": "python", "name": "python3" }, "language_info": { "codemirror_mode": { "name": "ipython", "version": 3 }, "file_extension": ".py", "mimetype": "text/x-python", "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", "version": "3.11.9" } }, "nbformat": 4, "nbformat_minor": 2 }