{ "cells": [ { "cell_type": "markdown", "metadata": {}, "source": [ "# Serializing Components \n", "\n", "AutoGen provides a {py:class}`~autogen_core.Component` configuration class that defines behaviours to serialize/deserialize component into declarative specifications. We can accomplish this by calling `.dump_component()` and `.load_component()` respectively. This is useful for debugging, visualizing, and even for sharing your work with others. In this notebook, we will demonstrate how to serialize multiple components to a declarative specification like a JSON file. \n", "\n", "\n", "```{warning}\n", "\n", "ONLY LOAD COMPONENTS FROM TRUSTED SOURCES.\n", "\n", "With serilized components, each component implements the logic for how it is serialized and deserialized - i.e., how the declarative specification is generated and how it is converted back to an object. \n", "\n", "In some cases, creating an object may include executing code (e.g., a serialized function). ONLY LOAD COMPONENTS FROM TRUSTED SOURCES. \n", " \n", "```\n", "\n", " \n", "### Termination Condition Example \n", "\n", "In the example below, we will define termination conditions (a part of an agent team) in python, export this to a dictionary/json and also demonstrate how the termination condition object can be loaded from the dictionary/json. \n", " " ] }, { "cell_type": "code", "execution_count": 1, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Config: {\"provider\":\"autogen_agentchat.base.OrTerminationCondition\",\"component_type\":\"termination\",\"version\":1,\"component_version\":1,\"description\":null,\"config\":{\"conditions\":[{\"provider\":\"autogen_agentchat.conditions.MaxMessageTermination\",\"component_type\":\"termination\",\"version\":1,\"component_version\":1,\"config\":{\"max_messages\":5}},{\"provider\":\"autogen_agentchat.conditions.StopMessageTermination\",\"component_type\":\"termination\",\"version\":1,\"component_version\":1,\"config\":{}}]}}\n" ] } ], "source": [ "from autogen_agentchat.conditions import MaxMessageTermination, StopMessageTermination\n", "\n", "max_termination = MaxMessageTermination(5)\n", "stop_termination = StopMessageTermination()\n", "\n", "or_termination = max_termination | stop_termination\n", "\n", "or_term_config = or_termination.dump_component()\n", "print(\"Config: \", or_term_config.model_dump_json())\n", "\n", "new_or_termination = or_termination.load_component(or_term_config)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Agent Example \n", "\n", "In the example below, we will define an agent in python, export this to a dictionary/json and also demonstrate how the agent object can be loaded from the dictionary/json." 
] }, { "cell_type": "code", "execution_count": 2, "metadata": {}, "outputs": [], "source": [ "from autogen_agentchat.agents import AssistantAgent, UserProxyAgent\n", "from autogen_ext.models.openai import OpenAIChatCompletionClient\n", "\n", "# Create an agent that uses the OpenAI GPT-4o model.\n", "model_client = OpenAIChatCompletionClient(\n", " model=\"gpt-4o\",\n", " # api_key=\"YOUR_API_KEY\",\n", ")\n", "agent = AssistantAgent(\n", " name=\"assistant\",\n", " model_client=model_client,\n", " handoffs=[\"flights_refunder\", \"user\"],\n", " # tools=[], # serializing tools is not yet supported\n", " system_message=\"Use tools to solve tasks.\",\n", ")\n", "user_proxy = UserProxyAgent(name=\"user\")" ] }, { "cell_type": "code", "execution_count": 3, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "{\"provider\":\"autogen_agentchat.agents.UserProxyAgent\",\"component_type\":\"agent\",\"version\":1,\"component_version\":1,\"description\":null,\"config\":{\"name\":\"user\",\"description\":\"A human user\"}}\n" ] } ], "source": [ "user_proxy_config = user_proxy.dump_component() # dump component\n", "print(user_proxy_config.model_dump_json())\n", "up_new = user_proxy.load_component(user_proxy_config) # load component" ] }, { "cell_type": "code", "execution_count": 4, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "{\"provider\":\"autogen_agentchat.agents.AssistantAgent\",\"component_type\":\"agent\",\"version\":1,\"component_version\":1,\"description\":null,\"config\":{\"name\":\"assistant\",\"model_client\":{\"provider\":\"autogen_ext.models.openai.OpenAIChatCompletionClient\",\"component_type\":\"model\",\"version\":1,\"component_version\":1,\"config\":{\"model\":\"gpt-4o\"}},\"handoffs\":[{\"target\":\"flights_refunder\",\"description\":\"Handoff to flights_refunder.\",\"name\":\"transfer_to_flights_refunder\",\"message\":\"Transferred to flights_refunder, adopting the role of flights_refunder immediately.\"},{\"target\":\"user\",\"description\":\"Handoff to user.\",\"name\":\"transfer_to_user\",\"message\":\"Transferred to user, adopting the role of user immediately.\"}],\"model_context\":{\"provider\":\"autogen_core.model_context.UnboundedChatCompletionContext\",\"component_type\":\"chat_completion_context\",\"version\":1,\"component_version\":1,\"config\":{}},\"description\":\"An agent that provides assistance with ability to use tools.\",\"system_message\":\"Use tools to solve tasks.\",\"reflect_on_tool_use\":false,\"tool_call_summary_format\":\"{result}\"}}\n" ] } ], "source": [ "agent_config = agent.dump_component() # dump component\n", "print(agent_config.model_dump_json())\n", "agent_new = agent.load_component(agent_config) # load component" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "A similar approach can be used to serialize the `MultiModalWebSurfer` agent.\n", "\n", "```python\n", "from autogen_ext.agents.web_surfer import MultimodalWebSurfer\n", "\n", "agent = MultimodalWebSurfer(\n", " name=\"web_surfer\",\n", " model_client=model_client,\n", " headless=False,\n", ")\n", "\n", "web_surfer_config = agent.dump_component() # dump component\n", "print(web_surfer_config.model_dump_json())\n", "\n", "```" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Team Example\n", "\n", "In the example below, we will define a team in python, export this to a dictionary/json and also demonstrate how the team object can be loaded from the dictionary/json." 
] }, { "cell_type": "code", "execution_count": 2, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "{\"provider\":\"autogen_agentchat.teams.RoundRobinGroupChat\",\"component_type\":\"team\",\"version\":1,\"component_version\":1,\"description\":null,\"config\":{\"participants\":[{\"provider\":\"autogen_agentchat.agents.AssistantAgent\",\"component_type\":\"agent\",\"version\":1,\"component_version\":1,\"config\":{\"name\":\"assistant\",\"model_client\":{\"provider\":\"autogen_ext.models.openai.OpenAIChatCompletionClient\",\"component_type\":\"model\",\"version\":1,\"component_version\":1,\"config\":{\"model\":\"gpt-4o\"}},\"handoffs\":[{\"target\":\"flights_refunder\",\"description\":\"Handoff to flights_refunder.\",\"name\":\"transfer_to_flights_refunder\",\"message\":\"Transferred to flights_refunder, adopting the role of flights_refunder immediately.\"},{\"target\":\"user\",\"description\":\"Handoff to user.\",\"name\":\"transfer_to_user\",\"message\":\"Transferred to user, adopting the role of user immediately.\"}],\"model_context\":{\"provider\":\"autogen_core.model_context.UnboundedChatCompletionContext\",\"component_type\":\"chat_completion_context\",\"version\":1,\"component_version\":1,\"config\":{}},\"description\":\"An agent that provides assistance with ability to use tools.\",\"system_message\":\"Use tools to solve tasks.\",\"reflect_on_tool_use\":false,\"tool_call_summary_format\":\"{result}\"}}],\"termination_condition\":{\"provider\":\"autogen_agentchat.conditions.MaxMessageTermination\",\"component_type\":\"termination\",\"version\":1,\"component_version\":1,\"config\":{\"max_messages\":2}}}}\n" ] } ], "source": [ "from autogen_agentchat.agents import AssistantAgent, UserProxyAgent\n", "from autogen_agentchat.conditions import MaxMessageTermination\n", "from autogen_agentchat.teams import RoundRobinGroupChat\n", "from autogen_ext.models.openai import OpenAIChatCompletionClient\n", "\n", "# Create an agent that uses the OpenAI GPT-4o model.\n", "model_client = OpenAIChatCompletionClient(\n", " model=\"gpt-4o\",\n", " # api_key=\"YOUR_API_KEY\",\n", ")\n", "agent = AssistantAgent(\n", " name=\"assistant\",\n", " model_client=model_client,\n", " handoffs=[\"flights_refunder\", \"user\"],\n", " # tools=[], # serializing tools is not yet supported\n", " system_message=\"Use tools to solve tasks.\",\n", ")\n", "\n", "team = RoundRobinGroupChat(participants=[agent], termination_condition=MaxMessageTermination(2))\n", "\n", "team_config = team.dump_component() # dump component\n", "print(team_config.model_dump_json())" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [] } ], "metadata": { "kernelspec": { "display_name": ".venv", "language": "python", "name": "python3" }, "language_info": { "codemirror_mode": { "name": "ipython", "version": 3 }, "file_extension": ".py", "mimetype": "text/x-python", "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", "version": "3.11.9" } }, "nbformat": 4, "nbformat_minor": 2 }