ai-agents-for-beginners

Exploring AI Agent Frameworks

(Click the image above to watch the video for this lesson)

Explore AI Agent Frameworks

AI agent frameworks are software platforms designed to simplify the creation, deployment, and management of AI agents. These frameworks provide developers with pre-built components, tools, and abstractions that streamline the development of complex AI systems.

These frameworks help developers focus on the unique aspects of their applications by providing standardized approaches to the common challenges in AI agent development. They improve scalability, accessibility, and efficiency when building AI systems.

Introduction

This lesson will cover:

Learning goals

The goals of this lesson are to help you understand:

What AI Agent Frameworks are and what they enable developers to do.

Traditional AI Frameworks can help you integrate AI into your apps and make those apps better in the following ways:

Sounds great, right? So why do we need an AI Agent Framework?

AI Agent frameworks represent something more than plain AI frameworks. They are designed to enable the creation of intelligent agents that can interact with users, other agents, and the environment to achieve specific goals. These agents can exhibit autonomous behavior, make decisions, and adapt to changing conditions. Let's look at some key capabilities enabled by AI Agent Frameworks:

In summary, agents let you do more, take automation to the next level, and create smarter systems that can learn and adapt to their environment.

How can you quickly prototype, iterate, and improve an agent's capabilities?

This is a fast-moving space, but there are a few things common across most AI Agent Frameworks that can help you quickly prototype and iterate, namely modular components, collaborative tools, and real-time learning. Let's dive into these:

Use Modular Components

SDKs like Microsoft Semantic Kernel and LangChain offer pre-built components such as AI connectors, prompt templates, and memory management.

How teams can use these: Teams can quickly assemble these components to create a working prototype without starting from scratch, allowing for rapid experimentation and iteration.

How it works in practice: You can use a pre-built parser to extract information from user input, a memory module to store and retrieve data, and a prompt generator to interact with users, all without having to build these components from scratch.

Example code. Let's look at an example of how you can use a pre-built AI Connector with Semantic Kernel Python and .NET, using auto-function calling to have the model respond to user input:

# Semantic Kernel Python Example

import asyncio
from typing import Annotated

from semantic_kernel.connectors.ai import FunctionChoiceBehavior
from semantic_kernel.connectors.ai.open_ai import AzureChatCompletion, AzureChatPromptExecutionSettings
from semantic_kernel.contents import ChatHistory
from semantic_kernel.functions import kernel_function
from semantic_kernel.kernel import Kernel

# Define a ChatHistory object to hold the conversation's context
chat_history = ChatHistory()
chat_history.add_user_message("I'd like to go to New York on January 1, 2025")


# Define a sample plugin that contains the function to book travel
class BookTravelPlugin:
    """A Sample Book Travel Plugin"""

    @kernel_function(name="book_flight", description="Book travel given location and date")
    async def book_flight(
        self, date: Annotated[str, "The date of travel"], location: Annotated[str, "The location to travel to"]
    ) -> str:
        return f"Travel was booked to {location} on {date}"

# Create the Kernel
kernel = Kernel()

# Add the sample plugin to the Kernel object
kernel.add_plugin(BookTravelPlugin(), plugin_name="book_travel")

# Define the Azure OpenAI AI Connector
chat_service = AzureChatCompletion(
    deployment_name="YOUR_DEPLOYMENT_NAME", 
    api_key="YOUR_API_KEY", 
    endpoint="https://<your-resource>.azure.openai.com/",
)

# Define the request settings to configure the model with auto-function calling
request_settings = AzureChatPromptExecutionSettings(function_choice_behavior=FunctionChoiceBehavior.Auto())


async def main():
    # Make the request to the model for the given chat history and request settings
    # The Kernel contains the sample plugin that the model may request to invoke
    response = await chat_service.get_chat_message_content(
        chat_history=chat_history, settings=request_settings, kernel=kernel
    )
    assert response is not None

    """
    Note: In the auto function calling process, the model determines it can invoke the 
    `BookTravelPlugin` using the `book_flight` function, supplying the necessary arguments. 
    
    For example:

    "tool_calls": [
        {
            "id": "call_abc123",
            "type": "function",
            "function": {
                "name": "BookTravelPlugin-book_flight",
                "arguments": "{'location': 'New York', 'date': '2025-01-01'}"
            }
        }
    ]

    Since the location and date arguments are required (as defined by the kernel function), if the 
    model lacks either, it will prompt the user to provide them. For instance:

    User: Book me a flight to New York.
    Model: Sure, I'd love to help you book a flight. Could you please specify the date?
    User: I want to travel on January 1, 2025.
    Model: Your flight to New York on January 1, 2025, has been successfully booked. Safe travels!
    """

    print(f"`{response}`")
    # Example AI Model Response: `Your flight to New York on January 1, 2025, has been successfully booked. Safe travels! ✈️🗽`

    # Add the model's response to our chat history context
    chat_history.add_assistant_message(response.content)


if __name__ == "__main__":
    asyncio.run(main())

// Semantic Kernel C# example

using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.ChatCompletion;
using System.ComponentModel;
using Microsoft.SemanticKernel.Connectors.AzureOpenAI;

ChatHistory chatHistory = [];
chatHistory.AddUserMessage("I'd like to go to New York on January 1, 2025");

var kernelBuilder = Kernel.CreateBuilder();
kernelBuilder.AddAzureOpenAIChatCompletion(
    deploymentName: "NAME_OF_YOUR_DEPLOYMENT",
    apiKey: "YOUR_API_KEY",
    endpoint: "YOUR_AZURE_ENDPOINT"
);
kernelBuilder.Plugins.AddFromType<BookTravelPlugin>("BookTravel"); 
var kernel = kernelBuilder.Build();

var settings = new AzureOpenAIPromptExecutionSettings()
{
    FunctionChoiceBehavior = FunctionChoiceBehavior.Auto()
};

var chatCompletion = kernel.GetRequiredService<IChatCompletionService>();

var response = await chatCompletion.GetChatMessageContentAsync(chatHistory, settings, kernel);

/*
Behind the scenes, the model recognizes the tool to call, what arguments it already has (location) and (date)
{

"tool_calls": [
    {
        "id": "call_abc123",
        "type": "function",
        "function": {
            "name": "BookTravelPlugin-book_flight",
            "arguments": "{'location': 'New York', 'date': '2025-01-01'}"
        }
    }
]
*/

Console.WriteLine(response.Content);
chatHistory.AddMessage(response!.Role, response!.Content!);

// Example AI Model Response: Your flight to New York on January 1, 2025, has been successfully booked. Safe travels! ✈️🗽

// Define a plugin that contains the function to book travel
public class BookTravelPlugin
{
    [KernelFunction("book_flight")]
    [Description("Book travel given location and date")]
    public async Task<string> BookFlight(DateTime date, string location)
    {
        return await Task.FromResult( $"Travel was booked to {location} on {date}");
    }
}

In this example, you can see how a pre-built AI connector with auto-function calling extracts the key information from user input, such as the destination and date of a flight booking request. This modular approach lets you focus on the high-level logic.

Leverage Collaborative Tools

Frameworks like CrewAI, Microsoft AutoGen, and Semantic Kernel facilitate the creation of multiple agents that can work together.

How teams can use these: Teams can design agents with specific roles and tasks, enabling them to test and refine collaborative workflows and improve overall system efficiency.

How it works in practice: You can create a team of agents where each agent has a specialized function, such as data retrieval, analysis, or decision-making. These agents can communicate and share information to achieve a common goal, such as answering a user query or completing a task.

Example code (AutoGen):

# The agents and team below use the AutoGen AgentChat API (autogen_agentchat).
# model_client, retrieve_tool, and analyze_tool are assumed to be defined elsewhere,
# e.g. an OpenAI/Azure chat completion client and two Python tool functions.
from autogen_agentchat.agents import AssistantAgent, UserProxyAgent
from autogen_agentchat.conditions import TextMentionTermination
from autogen_agentchat.teams import RoundRobinGroupChat
from autogen_agentchat.ui import Console

# Create the agents, then set up a round-robin schedule where they work together, in this case in order:

# Data Retrieval Agent
# Data Analysis Agent
# Decision Making Agent

agent_retrieve = AssistantAgent(
    name="dataretrieval",
    model_client=model_client,
    tools=[retrieve_tool],
    system_message="Use tools to solve tasks."
)

agent_analyze = AssistantAgent(
    name="dataanalysis",
    model_client=model_client,
    tools=[analyze_tool],
    system_message="Use tools to solve tasks."
)

# conversation ends when user says "APPROVE"
termination = TextMentionTermination("APPROVE")

user_proxy = UserProxyAgent("user_proxy", input_func=input)

team = RoundRobinGroupChat([agent_retrieve, agent_analyze, user_proxy], termination_condition=termination)

stream = team.run_stream(task="Analyze data", max_turns=10)
# Use asyncio.run(...) when running in a script.
await Console(stream)

In the code above, you can see how to create a task that involves multiple agents working together to analyze data. Each agent performs a distinct function, and the task is executed by coordinating the agents to achieve the desired outcome. By creating dedicated agents with specialized roles, you can improve task efficiency and performance.

Learn in Real-Time

Advanced frameworks provide capabilities for real-time context understanding and adaptation.

How teams can use these: Teams can implement feedback loops where agents learn from interactions and adjust their behavior dynamically, leading to continuous improvement and refinement of their capabilities.

How it works in practice: Agents can analyze user feedback, environmental data, and task outcomes to update their knowledge base, adjust decision-making strategies, and improve performance over time. This iterative learning process lets agents adapt to changing conditions and user preferences, improving overall system effectiveness.
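To make the idea of a feedback loop concrete, here is a minimal, framework-agnostic sketch. The FeedbackStore class and adjust_prompt helper are hypothetical names invented for this example: the agent records a rating after each interaction, and future prompts are adjusted based on the accumulated feedback.

from collections import defaultdict
from dataclasses import dataclass, field

@dataclass
class FeedbackStore:
    """Hypothetical in-memory store that tracks per-topic feedback ratings."""
    scores: dict[str, list[int]] = field(default_factory=lambda: defaultdict(list))

    def record(self, topic: str, rating: int) -> None:
        # rating: e.g. +1 for helpful, -1 for unhelpful
        self.scores[topic].append(rating)

    def average(self, topic: str) -> float:
        ratings = self.scores[topic]
        return sum(ratings) / len(ratings) if ratings else 0.0

def adjust_prompt(base_prompt: str, store: FeedbackStore, topic: str) -> str:
    """Adapt the system prompt based on accumulated feedback for a topic."""
    if store.average(topic) < 0:
        # Past answers on this topic were rated poorly, so ask the model
        # to be more detailed and to cite its sources.
        return base_prompt + " Provide step-by-step detail and cite your sources."
    return base_prompt

# Usage: after each interaction, record feedback and rebuild the prompt.
store = FeedbackStore()
store.record("flight-booking", -1)
print(adjust_prompt("You are a helpful travel assistant.", store, "flight-booking"))

In a production system, the store could be backed by a database or one of the memory connectors shown earlier, and the adjustment could change tool selection or retrieval strategy rather than just the prompt text.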

What is the difference between the frameworks AutoGen, Semantic Kernel, and Azure AI Agent Service?

There are many ways to compare these frameworks, but let's look at some key differences in terms of their design, capabilities, and target use cases:

AutoGen

AutoGen is an open-source framework developed by Microsoft Research's AI Frontiers Lab. It focuses on event-driven, distributed agentic applications, enabling multiple LLMs and SLMs, tools, and advanced multi-agent design patterns.

AutoGen is built around the core concept of agents: autonomous entities that can perceive their environment, make decisions, and take actions to achieve specific goals. Agents communicate through asynchronous messages, which lets them work independently and in parallel and makes the system more scalable and responsive.

Agents are based on the actor model. According to Wikipedia, an actor is the basic building block of concurrent computation. In response to a message it receives, an actor can make local decisions, create more actors, send more messages, and decide how to respond to the next message it receives.
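To illustrate the actor model itself, here is a small sketch in plain asyncio. This is not the AutoGen API; the Actor class below is a made-up minimal example in which each actor owns a mailbox and handles one asynchronous message at a time.

import asyncio

class Actor:
    """Minimal actor: a mailbox (queue) plus a loop that handles one message at a time."""

    def __init__(self, name: str):
        self.name = name
        self.mailbox: asyncio.Queue = asyncio.Queue()

    async def send(self, message: str) -> None:
        # Sending is asynchronous and non-blocking for the sender.
        await self.mailbox.put(message)

    async def run(self) -> None:
        while True:
            message = await self.mailbox.get()
            if message == "STOP":
                break
            # Local decision: here we just transform the message, but a real agent
            # could call an LLM, spawn more actors, or send messages to other actors.
            print(f"{self.name} handled: {message.upper()}")

async def main():
    worker = Actor("data_retrieval")
    task = asyncio.create_task(worker.run())
    await worker.send("fetch sales report")
    await worker.send("STOP")
    await task

asyncio.run(main())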

Use Cases: Automating code generation, data analysis tasks, and building custom agents for planning and research functions.

Here are the core concepts of AutoGen:

Semantic Kernel + Agent Framework

Semantic Kernel is an enterprise-ready AI Orchestration SDK. It consists of AI and memory connectors, along with an Agent Framework.

Let's first cover the core components:

Such facts (for example, short summaries of Azure documentation) would be stored in a memory collection called SummarizedAzureDocs. This is a very simple example, but it shows how you can store information in memory for the LLM to use.
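As a rough sketch of storing such facts, assuming a Semantic Kernel Python version that still ships the experimental SemanticTextMemory API and an Azure OpenAI embedding deployment (the deployment name, endpoint, and facts below are placeholders):

import asyncio

from semantic_kernel.connectors.ai.open_ai import AzureTextEmbedding
from semantic_kernel.memory import SemanticTextMemory, VolatileMemoryStore

async def main():
    # An embedding service is needed to turn text into vectors for semantic search.
    embedding_service = AzureTextEmbedding(
        deployment_name="YOUR_EMBEDDING_DEPLOYMENT",
        api_key="YOUR_API_KEY",
        endpoint="https://<your-resource>.openai.azure.com/",
    )

    # VolatileMemoryStore keeps everything in process memory (fine for demos).
    memory = SemanticTextMemory(storage=VolatileMemoryStore(), embeddings_generator=embedding_service)

    # Store a few facts in the "SummarizedAzureDocs" collection.
    await memory.save_information(collection="SummarizedAzureDocs", id="info1", text="Azure Blob Storage is Microsoft's object storage solution for the cloud.")
    await memory.save_information(collection="SummarizedAzureDocs", id="info2", text="Azure Functions is a serverless compute service.")

    # Later, retrieve the fact most relevant to a user question.
    results = await memory.search(collection="SummarizedAzureDocs", query="Where can I store unstructured files?")
    print(results[0].text if results else "No match found")

if __name__ == "__main__":
    asyncio.run(main())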

Those are the basics of the Semantic Kernel framework. So what about the Agent Framework?

Azure AI Agent Service

Azure AI Agent Service is a more recent addition, introduced at Microsoft Ignite 2024. It enables the development and deployment of AI agents with more flexible models, such as directly calling open-source LLMs like Llama 3, Mistral, and Cohere.

Azure AI Agent Service provides stronger enterprise security mechanisms and data storage methods, making it well suited for enterprise applications.

It works out-of-the-box with multi-agent orchestration frameworks like AutoGen and Semantic Kernel.

The service is currently in Public Preview and supports Python and C# for building agents.

With Semantic Kernel Python, we can create an Azure AI Agent with a user-defined plugin:

import asyncio
from typing import Annotated

from azure.identity.aio import DefaultAzureCredential

from semantic_kernel.agents import AzureAIAgent, AzureAIAgentSettings, AzureAIAgentThread
from semantic_kernel.contents import AuthorRole, ChatMessageContent
from semantic_kernel.functions import kernel_function


# Define a sample plugin for the sample
class MenuPlugin:
    """A sample Menu Plugin used for the concept sample."""

    @kernel_function(description="Provides a list of specials from the menu.")
    def get_specials(self) -> Annotated[str, "Returns the specials from the menu."]:
        return """
        Special Soup: Clam Chowder
        Special Salad: Cobb Salad
        Special Drink: Chai Tea
        """

    @kernel_function(description="Provides the price of the requested menu item.")
    def get_item_price(
        self, menu_item: Annotated[str, "The name of the menu item."]
    ) -> Annotated[str, "Returns the price of the menu item."]:
        return "$9.99"


async def main() -> None:
    ai_agent_settings = AzureAIAgentSettings.create()

    async with (
        DefaultAzureCredential() as creds,
        AzureAIAgent.create_client(
            credential=creds,
            conn_str=ai_agent_settings.project_connection_string.get_secret_value(),
        ) as client,
    ):
        # Create agent definition
        agent_definition = await client.agents.create_agent(
            model=ai_agent_settings.model_deployment_name,
            name="Host",
            instructions="Answer questions about the menu.",
        )

        # Create the AzureAI Agent using the defined client and agent definition
        agent = AzureAIAgent(
            client=client,
            definition=agent_definition,
            plugins=[MenuPlugin()],
        )

        # Create a thread to hold the conversation
        # If no thread is provided, a new thread will be
        # created and returned with the initial response
        thread: AzureAIAgentThread | None = None

        user_inputs = [
            "Hello",
            "What is the special soup?",
            "How much does that cost?",
            "Thank you",
        ]

        try:
            for user_input in user_inputs:
                print(f"# User: '{user_input}'")
                # Invoke the agent for the specified thread
                response = await agent.get_response(
                    messages=user_input,
                    thread=thread,
                )
                print(f"# {response.name}: {response.content}")
                thread = response.thread
        finally:
            await thread.delete() if thread else None
            await client.agents.delete_agent(agent.id)


if __name__ == "__main__":
    asyncio.run(main())

Core concepts

Azure AI Agent Service has the following core concepts:

Use Cases: Azure AI Agent Service is designed for enterprise applications that require secure, scalable, and flexible AI agent deployment.

So what is the difference between these frameworks?

It does sound like there is a lot of overlap between these frameworks, but there are some key differences in their design, capabilities, and target use cases:

Still not sure which one to choose?

Use Cases

Let's walk through some common use cases to help you decide:

Q: I'm experimenting, learning, and building proof-of-concept agent applications, and I want to be able to build and experiment quickly.

A: AutoGen would be a good choice for this scenario, as it focuses on event-driven, distributed agentic applications and supports advanced multi-agent design patterns.

Q: What makes AutoGen a better choice than Semantic Kernel and Azure AI Agent Service for this use case?

A: AutoGen is specifically designed for event-driven, distributed agentic applications, which makes it well suited for automating code generation and data analysis tasks. It provides the tools and capabilities needed to build complex multi-agent systems efficiently.

Q: Sounds like Azure AI Agent Service could work for this use case as well; it has tools for code generation and more?

A: Yes, Azure AI Agent Service is a platform service for agents with built-in capabilities for multiple models, Azure AI Search, Bing Search, and Azure Functions. It makes it easy to build your agents in the Foundry Portal and deploy them at scale.

Q: I'm still confused, just give me one option.

A: A great choice is to build your application in Semantic Kernel first and then use Azure AI Agent Service to deploy your agent. This approach lets you easily persist your agents while leveraging the power of building multi-agent systems in Semantic Kernel. Additionally, Semantic Kernel has a connector in AutoGen, which makes it easy to use both frameworks together.

Let's summarize the key differences in a table:

| Framework | Focus | Core Concepts | Use Cases |
| --- | --- | --- | --- |
| AutoGen | Event-driven, distributed agentic applications | Agents, Personas, Functions, Data | Code generation, data analysis tasks |
| Semantic Kernel | Understanding and generating human-like text content | Agents, Modular Components, Collaboration | Natural language understanding, content generation |
| Azure AI Agent Service | Flexible models, enterprise security, code generation, tool calling | Modularity, Collaboration, Process Orchestration | Secure, scalable, and flexible AI agent deployment |

What is the ideal use case for each of these frameworks?

Can I integrate my existing Azure ecosystem tools directly, or do I need standalone solutions?

The answer is yes, you can integrate your existing Azure ecosystem tools directly, especially with Azure AI Agent Service, because it has been built to work seamlessly with other Azure services. For example, you can integrate Bing, Azure AI Search, and Azure Functions, and there is deep integration with Azure AI Foundry.

For AutoGen and Semantic Kernel, you can also integrate with Azure services, but you may need to call the Azure services from your own code. Another way to integrate is to use the Azure SDKs to interact with Azure services from your agents. Additionally, as mentioned, you can use Azure AI Agent Service as an orchestrator for agents built in AutoGen or Semantic Kernel, which gives you easy access to the Azure ecosystem.
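As a small illustration of calling the Azure SDKs from your agent code, the sketch below wraps an Azure Blob Storage call in a Semantic Kernel plugin function. The AZURE_STORAGE_CONNECTION_STRING environment variable and the plugin name are assumptions made for this example.

import os
from typing import Annotated

from azure.storage.blob import BlobServiceClient  # Azure SDK for Python
from semantic_kernel.functions import kernel_function

class BlobStoragePlugin:
    """Exposes an Azure SDK call as a tool the agent can invoke."""

    @kernel_function(description="Lists the blob containers in the configured storage account.")
    def list_containers(self) -> Annotated[str, "Comma-separated container names."]:
        # AZURE_STORAGE_CONNECTION_STRING is an assumed environment variable for this sketch.
        client = BlobServiceClient.from_connection_string(os.environ["AZURE_STORAGE_CONNECTION_STRING"])
        names = [container.name for container in client.list_containers()]
        return ", ".join(names)

# The plugin can then be registered on a kernel or agent just like the plugins shown earlier, e.g.:
# kernel.add_plugin(BlobStoragePlugin(), plugin_name="blob_storage")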

Have More Questions about AI Agent Frameworks?

Join the Azure AI Foundry Discord to meet other learners, attend office hours, and get your questions about AI Agents answered.

References

Previous Lesson

Introduction to AI Agents and Agent Use Cases

Next Lesson

Understanding Agentic Design Patterns

