(Click the image above to watch the video for this lesson)
Tools are interesting because they extend what AI agents can do. Rather than being limited to a fixed set of actions, an agent with access to tools can perform a much wider range of tasks. In this chapter, we look at the Tool Use Design Pattern, which describes how AI agents can use the right tools to achieve their goals.
In this lesson, we want to answer the following questions:
After completing this lesson, you will be able to:
The Tool Use Design Pattern focuses on giving LLMs the ability to interact with external tools to achieve specific goals. Tools are code that the agent can execute to perform actions. A tool can be a simple function such as a calculator, or an API call to a third-party service such as a stock price lookup or a weather forecast. In the context of AI agents, tools are designed to be executed by agents in response to model-generated function calls.
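For instance, at its simplest a "tool" is just an ordinary function the agent runs on the model's behalf. The calculator-style function below is purely illustrative; the name and signature are not from any particular framework:

```python
# At its simplest, a "tool" is just a function the agent can execute
# when the model requests it. This calculator-style tool is illustrative.
def add_numbers(a: float, b: float) -> float:
    """Return the sum of two numbers."""
    return a + b

# The agent layer would invoke the tool with model-generated arguments:
print(add_numbers(2, 3))  # 5
```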
AI agents can use tools to complete complex tasks, retrieve information, or make decisions. The Tool Use Design Pattern is often applied in scenarios requiring dynamic interaction with external systems such as databases, web services, or code interpreters. This capability is useful for a number of different use cases, including:
These building blocks allow an AI agent to perform a wide range of tasks. Let's look at the key elements needed to implement the Tool Use Design Pattern:
Function/Tool Schemas: Detailed definitions of the available tools, including the function name, what it does, the parameters it requires, and the outputs it returns. These schemas enable the LLM to understand which tools are available and how to construct valid requests.
Function Execution Logic: Determines how and when tools are invoked based on the user's intent and the conversation context. This may include planner modules, routing mechanisms, or conditional flows that decide tool usage dynamically.
Message Handling System: Components that manage the conversational flow between user inputs, LLM responses, tool calls, and tool outputs.
Tool Integration Framework: Infrastructure that connects the agent to various tools, whether simple functions or complex external services.
Error Handling & Validation: Mechanisms to handle failures in tool execution, validate parameters, and manage unexpected responses.
State Management: Tracks the conversation context, previous tool interactions, and persistent data to ensure consistency across multi-turn interactions.
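To make the execution-logic and error-handling building blocks concrete, here is a minimal sketch of a tool dispatcher. The message shape and function names are simplified assumptions for illustration, not any specific framework's API:

```python
import json

def execute_tool_call(name: str, raw_arguments: str, registry: dict) -> dict:
    """Run one model-requested tool call and package the result as a
    'tool' message for the message-handling system."""
    fn = registry.get(name)
    if fn is None:
        # Error handling: the model asked for a tool we do not have.
        content = json.dumps({"error": f"unknown tool: {name}"})
    else:
        try:
            args = json.loads(raw_arguments)  # validate model-supplied JSON
            content = str(fn(**args))
        except Exception as exc:  # bad JSON, bad parameters, or tool failure
            content = json.dumps({"error": str(exc)})
    return {"role": "tool", "name": name, "content": content}

# Example dispatch with a single registered tool:
registry = {"echo": lambda text: text}
print(execute_tool_call("echo", '{"text": "hi"}', registry))
```

The key design choice is that failures are returned to the model as structured error content instead of raising, so the conversation can continue and the model can recover or apologize.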
Next, let's look at Function/Tool Calling in more detail.
Function calling is the primary way we enable Large Language Models (LLMs) to interact with tools. You will often see 'Function' and 'Tool' used interchangeably, because 'functions' (blocks of reusable code) are the 'tools' agents use to carry out tasks. For a function's code to be invoked, the LLM must compare the user's request against the function's description. To do this, a schema containing the descriptions of all available functions is sent to the LLM. The LLM then selects the most appropriate function for the task and returns its name and arguments. The selected function is invoked, its response is sent back to the LLM, and the LLM uses that information to respond to the user's request.
For developers to implement function calling for agents, you will need:
Let's use the example of getting the current time in a city to illustrate:
Initialize an LLM that supports function calling:
Not all models support function calling, so it is important to check that the LLM you are using does. Azure OpenAI supports function calling. We can start by initializing the Azure OpenAI client.
import os
from openai import AzureOpenAI

# Initialize the Azure OpenAI client
client = AzureOpenAI(
    azure_endpoint=os.getenv("AZURE_AI_PROJECT_ENDPOINT"),
    api_key=os.getenv("AZURE_OPENAI_API_KEY"),
    api_version="2024-05-01-preview"
)
Create a Function Schema:
Next, we will define a JSON schema that contains the function name, a description of what the function does, and the names and descriptions of its parameters. We will then pass this schema to the client created earlier, along with the user's request to find the time in San Francisco. What's important to note is that a tool call is what is returned, not the final answer to the question. As mentioned earlier, the LLM returns the name of the function it selected for the task and the arguments that will be passed to it.
# Function description for the model to read
tools = [
    {
        "type": "function",
        "function": {
            "name": "get_current_time",
            "description": "Get the current time in a given location",
            "parameters": {
                "type": "object",
                "properties": {
                    "location": {
                        "type": "string",
                        "description": "The city name, e.g. San Francisco",
                    },
                },
                "required": ["location"],
            },
        }
    }
]

# Initial user message
messages = [{"role": "user", "content": "What's the current time in San Francisco"}]

# First API call: Ask the model to use the function
response = client.chat.completions.create(
    model=deployment_name,
    messages=messages,
    tools=tools,
    tool_choice="auto",
)

# Process the model's response
response_message = response.choices[0].message
messages.append(response_message)

print("Model's response:")
print(response_message)
Model's response:
ChatCompletionMessage(content=None, role='assistant', function_call=None, tool_calls=[ChatCompletionMessageToolCall(id='call_pOsKdUlqvdyttYB67MOj434b', function=Function(arguments='{"location":"San Francisco"}', name='get_current_time'), type='function')])
Function code required to carry out the task:
Now that the LLM has selected which function needs to be run, the code that carries out the task must be implemented and executed. We can implement the code to get the current time in Python. We also need to write the code to extract the name and arguments from the response_message to get the final result.
import json
from datetime import datetime
from zoneinfo import ZoneInfo

# Mapping of city keywords to IANA timezone names.
# A minimal sample for this example; extend as needed.
TIMEZONE_DATA = {
    "san francisco": "America/Los_Angeles",
    "new york": "America/New_York",
    "london": "Europe/London",
}

def get_current_time(location):
    """Get the current time for a given location"""
    print(f"get_current_time called with location: {location}")
    location_lower = location.lower()

    for key, timezone in TIMEZONE_DATA.items():
        if key in location_lower:
            print(f"Timezone found for {key}")
            current_time = datetime.now(ZoneInfo(timezone)).strftime("%I:%M %p")
            return json.dumps({
                "location": location,
                "current_time": current_time
            })

    print(f"No timezone data found for {location_lower}")
    return json.dumps({"location": location, "current_time": "unknown"})

# Handle function calls
if response_message.tool_calls:
    for tool_call in response_message.tool_calls:
        if tool_call.function.name == "get_current_time":
            function_args = json.loads(tool_call.function.arguments)
            time_response = get_current_time(
                location=function_args.get("location")
            )
            messages.append({
                "tool_call_id": tool_call.id,
                "role": "tool",
                "name": "get_current_time",
                "content": time_response,
            })
else:
    print("No tool calls were made by the model.")

# Second API call: Get the final response from the model
final_response = client.chat.completions.create(
    model=deployment_name,
    messages=messages,
)

print(final_response.choices[0].message.content)
get_current_time called with location: San Francisco
Timezone found for san francisco
The current time in San Francisco is 09:24 AM.
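One detail worth hardening in the loop above: `tool_call.function.arguments` is model-generated text, and models occasionally emit malformed JSON. A small helper like the sketch below (illustrative, not part of any SDK) can guard the `json.loads` step:

```python
import json

def safe_parse_arguments(raw: str) -> dict:
    """Parse model-generated tool arguments, falling back to an empty
    dict if the JSON is malformed or is not an object."""
    try:
        args = json.loads(raw)
    except json.JSONDecodeError:
        return {}
    return args if isinstance(args, dict) else {}

print(safe_parse_arguments('{"location": "San Francisco"}'))  # {'location': 'San Francisco'}
print(safe_parse_arguments('not valid json'))                 # {}
```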
Function calling is at the heart of most, if not all, agent tool use designs, but implementing it from scratch can be challenging. As we learned in Lesson 2, agentic frameworks give us pre-built building blocks for implementing tool use.
Here are some examples of how you can implement the Tool Use Design Pattern using different agentic frameworks:
Microsoft Agent Framework is an open-source AI framework for building AI agents. It simplifies function calling by letting you define tools as Python functions using the @tool decorator. The framework handles the communication between the model and your code. It also provides access to pre-built tools such as File Search and Code Interpreter through the AzureAIProjectAgentProvider.
The diagram below shows how function calling works with Microsoft Agent Framework:

In Microsoft Agent Framework, tools are decorated functions. We can turn the get_current_time function shown earlier into a tool by applying the @tool decorator. The framework serializes the function and its parameters and creates the schema to send to the LLM.
from agent_framework import tool
from agent_framework.azure import AzureAIProjectAgentProvider
from azure.identity import AzureCliCredential

@tool
def get_current_time(location: str) -> str:
    """Get the current time for a given location"""
    ...

# Create the client
provider = AzureAIProjectAgentProvider(credential=AzureCliCredential())

# Create an agent and run it with the tool (must run inside an async context)
agent = await provider.create_agent(
    name="TimeAgent",
    instructions="Use available tools to answer questions.",
    tools=get_current_time,
)
response = await agent.run("What time is it?")
Azure AI Agent Service is a newer agentic framework that empowers developers to securely build, deploy, and scale high-quality, extensible AI agents without needing to manage the underlying compute and storage resources. It is particularly well suited to enterprise applications, since it is a fully managed service with enterprise-grade security.
Compared to developing with the LLM API directly, Azure AI Agent Service provides some advantages, including:
Tools in Azure AI Agent Service can be divided into two categories:
The Agent Service allows us to use these tools together as a toolset. It also uses threads, which keep track of the history of messages from a particular conversation.
Imagine you are a sales agent at a company called Contoso. You want to develop a conversational agent that can answer questions about your sales data.
The image below illustrates how you could use Azure AI Agent Service to analyze your sales data:

To use any of these tools with the service, we can create a client and then define a tool or toolset. We can use the Python code below to implement this. The LLM will be able to look at the toolset and decide whether to use the user-created function, fetch_sales_data_using_sqlite_query, or the pre-built Code Interpreter, depending on the user's request.
import os
from azure.ai.projects import AIProjectClient
from azure.identity import DefaultAzureCredential
from azure.ai.projects.models import ToolSet, FunctionTool, CodeInterpreterTool

# fetch_sales_data_using_sqlite_query function is defined in the fetch_sales_data_functions.py file.
from fetch_sales_data_functions import fetch_sales_data_using_sqlite_query

project_client = AIProjectClient.from_connection_string(
    credential=DefaultAzureCredential(),
    conn_str=os.environ["PROJECT_CONNECTION_STRING"],
)

# Set up the toolset
toolset = ToolSet()

# Initialize a function tool with the fetch_sales_data_using_sqlite_query function and add it to the toolset
fetch_data_function = FunctionTool(fetch_sales_data_using_sqlite_query)
toolset.add(fetch_data_function)

# Initialize the Code Interpreter tool and add it to the toolset.
code_interpreter = CodeInterpreterTool()
toolset.add(code_interpreter)

agent = project_client.agents.create_agent(
    model="gpt-4o-mini",
    name="my-agent",
    instructions="You are a helpful agent",
    toolset=toolset,
)
A common concern with SQL generated dynamically by an LLM is security, particularly the risk of SQL injection or malicious actions such as dropping or tampering with the database. While these concerns are valid, they can be effectively mitigated by properly configuring database access permissions. For most databases, this involves configuring the database as read-only. For database services such as PostgreSQL or Azure SQL, the app should be assigned a read-only (SELECT) role.
Running the app in a secure environment further enhances protection. In enterprise scenarios, data is typically extracted and transformed from operational systems into a read-only database or data warehouse with a user-friendly schema. This approach ensures that the data is secure and optimized for performance and accessibility, and that the app has restricted, read-only access.
Join the Microsoft Foundry Discord to meet other learners, attend office hours, and get your AI Agents questions answered.
Understanding Agentic Design Patterns
Disclaimer: This document has been translated using the AI translation service Co-op Translator. While we strive for accuracy, please be aware that automated translations may contain errors or inaccuracies. The original document in its native language should be considered the authoritative source. For critical information, professional human translation is recommended. We are not liable for any misunderstandings or misinterpretations arising from the use of this translation.