# Tools (Functions)
You can register tools (also known as functions) that the LLM may decide to call as part of assembling the answer. See OpenAI functions.
## Definition
`defTool` is used to define a tool that can be called by the LLM.
It takes a JSON schema to define the input and expects a string output.
The LLM decides to call this tool on its own!
```js
defTool(
    "current_weather",
    "get the current weather",
    {
        type: "object",
        properties: {
            location: {
                type: "string",
                description: "The city and state, e.g. San Francisco, CA",
            },
        },
        required: ["location"],
    },
    (args) => {
        const { location } = args
        if (location === "Brussels") return "sunny"
        else return "variable"
    }
)
```
In the example above, we define a tool called `current_weather` that takes a location as input and returns the weather.
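Under the hood, the LLM does not run the callback itself; it emits a tool-call message and the runtime dispatches it. The snippet below is a hypothetical sketch of an OpenAI-style tool-call message for the tool above (field names follow the OpenAI chat API; the `id` value is illustrative, and the exact shape depends on the provider):

```js
// Hypothetical assistant message requesting a call to current_weather.
const assistantMessage = {
    role: "assistant",
    tool_calls: [
        {
            id: "call_1", // illustrative id, normally assigned by the model
            type: "function",
            function: {
                name: "current_weather",
                arguments: '{"location":"Brussels"}', // JSON-encoded string
            },
        },
    ],
}

// The runtime decodes `arguments` and invokes the registered callback:
const args = JSON.parse(assistantMessage.tool_calls[0].function.arguments)
console.log(args.location) // prints: Brussels
```

The callback's return value is sent back to the model as a tool message, and generation continues from there.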
## Weather tool example
This example uses the `current_weather` tool to get the weather for Brussels.
```js
script({
    model: "openai:gpt-3.5-turbo",
    title: "Weather as function",
    description:
        "Query the weather for each city using a dummy weather function",
    temperature: 0.5,
    files: "src/cities.md",
    tests: {
        files: "src/cities.md",
        keywords: "Brussels",
    },
})

$`Query the weather for each listed city and return the results as a table.`

def("CITIES", env.files)

defTool(
    "get_current_weather",
    "get the current weather",
    {
        type: "object",
        properties: {
            location: {
                type: "string",
                description: "The city and state, e.g. San Francisco, CA",
            },
        },
        required: ["location"],
    },
    (args) => {
        const { context, location } = args
        const { trace } = context
        trace.log(`Getting weather for ${location}...`)
        let content = "variable"
        if (location === "Brussels") content = "sunny"
        return content
    }
)
```
## Math tool example
This example defines `sum` and `divide` tools that the LLM calls to evaluate an arithmetic expression.
```js
script({
    title: "math-agent",
    model: "gpt-35-turbo",
    description: "A port of https://ts.llamaindex.ai/examples/agent",
    parameters: {
        question: {
            type: "string",
            default: "How much is 11 + 4? then divide by 3?",
        },
    },
    tests: {
        description: "Testing the default prompt",
        keywords: "5",
    },
})

defTool(
    "sum",
    "Use this function to sum two numbers",
    {
        type: "object",
        properties: {
            a: {
                type: "number",
                description: "The first number",
            },
            b: {
                type: "number",
                description: "The second number",
            },
        },
        required: ["a", "b"],
    },
    ({ a, b }) => `${a + b}`
)

defTool(
    "divide",
    "Use this function to divide two numbers",
    {
        type: "object",
        properties: {
            a: {
                type: "number",
                description: "The first number",
            },
            b: {
                type: "number",
                description: "The second number",
            },
        },
        required: ["a", "b"],
    },
    ({ a, b }) => `${a / b}`
)

$`Answer the following arithmetic question:
${env.vars.question}`
```
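For the default question, the model is expected to chain the two tools: first `sum` on 11 and 4, then `divide` on the result and 3. The callbacks can be checked directly (reproduced here from the example above; note they return strings, so the intermediate result is converted back to a number):

```js
// The same tool callbacks as in the example above, extracted for a quick check.
const sum = ({ a, b }) => `${a + b}`
const divide = ({ a, b }) => `${a / b}`

const total = sum({ a: 11, b: 4 }) // "15"
const answer = divide({ a: Number(total), b: 3 }) // "5"
console.log(answer) // prints: 5
```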
## Packaging as System scripts
To pick and choose which tools to include in a script, you can group them in system scripts. For example, the `current_weather` tool can be included in the `system.current_weather.genai.js` script.
```js
script({
    title: "Get the current weather",
})
defTool("current_weather", ...)
```
Then use the script id in the `tools` field.
```js
script({
    ...,
    tools: ["system.current_weather"],
})
```
## Builtin tools
## Example
Let’s illustrate how tools come together with a question answering script.
In the script below, we add the `system.retrieval_web_search` system script, which registers the `retrieval_web_search` tool. This tool calls into `retrieval.webSearch` as needed.
```js
script({
    title: "Answer questions",
    system: ["system", "system.retrieval_web_search"],
})

def("FILES", env.files)

$`Answer the questions in FILES using a web search.

- List a summary of the answers and the sources used to create the answers.`
```
We can then apply this script to the `questions.md` file below.
```md
- What is weather in Seattle?
- What laws were voted in the USA congress last week?
```
After the first request, the LLM asks to call the `retrieval_web_search` tool for each question.
The web search answers are then added to the LLM message history and the request is made again.
The second request yields the final result, which incorporates the web search results.
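This two-round exchange can be sketched as a simple loop. The snippet below is a minimal, self-contained illustration of the pattern described above, not the actual runtime: `model` is a stub standing in for the LLM provider call, and the tool and message shapes are simplified assumptions.

```js
// Registered tools, keyed by name (stubbed search for illustration).
const tools = {
    retrieval_web_search: ({ q }) => `search results for "${q}"`,
}

// Stub model: first turn requests a tool call, second turn answers.
function model(messages) {
    if (!messages.some((m) => m.role === "tool"))
        return {
            tool_calls: [
                {
                    name: "retrieval_web_search",
                    arguments: { q: "weather in Seattle" },
                },
            ],
        }
    const result = messages.find((m) => m.role === "tool").content
    return { content: `Answer based on: ${result}` }
}

const messages = [{ role: "user", content: "What is weather in Seattle?" }]
let response = model(messages)
while (response.tool_calls) {
    // Run each requested tool and append its output to the history,
    // then ask the model again with the enriched history.
    for (const call of response.tool_calls) {
        messages.push({
            role: "tool",
            name: call.name,
            content: tools[call.name](call.arguments),
        })
    }
    response = model(messages)
}
console.log(response.content)
// prints: Answer based on: search results for "weather in Seattle"
```

The loop terminates once the model returns plain content instead of further tool calls.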