Fallback Tools

Tools are a powerful feature of LLMs that lets you augment the model's reasoning with calls to external functions.

These days, many LLMs come with built-in support for tools. However, some of them don’t… like OpenAI’s o1-preview and o1-mini.

Fallback tools

With GenAIScript 1.72.0, we introduce the concept of fallback tools. Basically, it is a system script that “teaches” the model about the available tools and how to call them. Here is an excerpt of that system script:

$`## Tool support
You can call external tools to help generating the answer of the user questions.
- The list of tools is defined in TOOLS. Use the description to help you choose the best tools.
- Each tool has an id, description, and a JSON schema for the arguments.
...
\`\`\`tool_calls
<tool_id>: { <JSON_serialized_tool_call_arguments> }
<tool_id_2>: { <JSON_serialized_tool_call_arguments_2> }
...
\`\`\`

A tool example

Here is an example of a tool that generates a random number between 0 and 1.

defTool("random", "Generate a random number", {}, () => Math.random())
$`Generate a random number between 0 and 1.`
  • o1-mini trace (using GitHub Models)
prompting github:o1-mini (~490 tokens)
```tool_calls
random: {}
```
prompting github:o1-mini (~532 tokens)
Your random number between 0 and 1 is **0.7792901036554349**.
  • gemma2 model (using Ollama)
prompting ollama:gemma2 (~716 tokens)
```tool_calls
random: {}
```
prompting ollama:gemma2 (~758 tokens)
The random number is 0.9552638470626966.
Let me know if you'd like to generate another random number!
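
The random tool above takes no arguments. Tools can also declare parameters with a JSON schema passed as the third argument to defTool; the handler then receives the parsed arguments. The sketch below illustrates the idea with a made-up "weather" tool (the tool name, schema fields, and handler behavior are purely illustrative):

```js
// sketch: a tool that takes arguments described by a JSON schema
// (the "weather" tool and its behavior are made up for illustration)
defTool(
    "weather",
    "Return a (fake) weather report for a city",
    {
        type: "object",
        properties: {
            city: { type: "string", description: "Name of the city" },
        },
        required: ["city"],
    },
    async ({ city }) => `It is always sunny in ${city}.`
)
$`What is the weather in Seattle?`
```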

Activation

The fallback tool mode is automatically activated for known LLM models that don’t support tools natively. The list is not complete, so open an issue if you stumble upon a model that should have fallback tools enabled.

It can also be activated manually (see documentation).
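
For reference, manual activation from a script looks roughly like the sketch below. The fallbackTools option name is my assumption here, so double-check it against the documentation:

```js
// sketch: force fallback tools on for this script
// (the `fallbackTools` option name is assumed; verify against the GenAIScript docs)
script({
    fallbackTools: true,
})
```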

LLM Agents

GenAIScript defines an agent as a tool that runs an inline prompt to accomplish a task. The agent LLM is typically augmented with additional tools.

In this blog post, we’ll walk through building a user interaction agent that can ask the user questions.

script({
    tools: ["agent_user_input"],
})
$`
Imagine a funny question and ask the user to answer it.
From the answer, generate 3 possible answers and ask the user to select the correct one.
Ask the user if the answer is correct.
`

Let’s dive into understanding how to create an “Agent that can ask questions to the user.”

You can find the full script on GitHub right here.

Metadata

The script is written in JavaScript. It starts by declaring the metadata to make the script available as a system script, which can be reused in other scripts.

system.agent_user_input.genai.mjs
system({
    title: "Agent that can ask questions to the user.",
})

This metadata sets the title of our system script, making it clear that it’s intended to interact with the user by asking questions.

title and description

The defAgent function defines the behavior of our agent. It takes an agent identifier and a description. These two are quite important, as they help the “host” LLM decide when to use this agent.

defAgent(
    "user_input",
    "Ask user for input to confirm, select or answer a question.",
    ...

GenAIScript automatically appends a description of all the tools used by the agent prompt, so you don’t have to cover that part in the description.

prompt

The third argument is a string or a function that crafts the prompt instructions for the agent LLM call. The agent implementation already contains generic prompting that makes the LLM behave like an agent, but you can add more to specify a role, tone, and dos and don’ts.

defAgent(
    ...,
    `You are an agent that can ask questions to the user and receive answers. Use the tools to interact with the user.
    - the message should be very clear. Add context from the conversation as needed.`,
    ...

model configuration

The last argument is a set of model options, similar to runPrompt, to configure the LLM call made by the agent. In particular, this is where you list the tools that the agent can use.

defAgent(
    ..., {
        tools: ["user_input"],
    }
)
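
Putting the pieces together, the full definition looks roughly like this (assembled from the snippets above; treat it as a sketch rather than the exact source on GitHub):

```js
// sketch: the complete agent definition, assembled from the snippets above
defAgent(
    "user_input",
    "Ask user for input to confirm, select or answer a question.",
    `You are an agent that can ask questions to the user and receive answers. Use the tools to interact with the user.
    - the message should be very clear. Add context from the conversation as needed.`,
    {
        tools: ["user_input"],
    }
)
```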

How to use the agent

The agent is used like any other tool by referencing it in the script options.

script({
    tools: ["agent_user_input"]
})
...

Let’s try it!

Let’s try the agent with:

script({
    tools: ["agent_user_input"],
})
$`Imagine a funny question and ask the user to answer it.
From the answer, generate 3 possible answers and ask the user to select the correct one.
Ask the user if the answer is correct.`

and let’s look at the results…

prompting openai:gpt-4o (~150 tokens)
agent user_input: What would be the most unexpected thing to find inside a refrigerator?
run prompt agent user_input
prompting openai:gpt-4o (~234 tokens)
user input text: What would be the most unexpected thing to find inside a refrigerator?

✔ What would be the most unexpected thing to find inside a refrigerator? toaster

prompting openai:gpt-4o (~240 tokens)
toaster
prompting openai:gpt-4o (~156 tokens)
agent user_input: Based on your answer, which of the following would also be unexpected to find inside a refrigerator?
1. A television
2. A penguin
3. A snowman
Please select the correct answer.
run prompt agent user_input
prompting openai:gpt-4o (~263 tokens)
user input select: Based on your answer, which of the following would also be unexpected to find inside a refrigerator?

✔ Based on your answer, which of the following would also be unexpected to find inside a refrigerator? A television

prompting openai:gpt-4o (~269 tokens)
A television
prompting openai:gpt-4o (~162 tokens)
agent user_input: Is your selection of 'A television' the correct unexpected item to find inside a refrigerator?
run prompt agent user_input
prompting openai:gpt-4o (~239 tokens)
user input confirm: Is your selection of 'A television' the correct unexpected item to find inside a refrigerator?

✔ Is your selection of ‘A television’ the correct unexpected item to find inside a refrigerator? yes

prompting openai:gpt-4o (~244 tokens)
true
prompting openai:gpt-4o (~167 tokens)
Great choice! A television inside a refrigerator would indeed be quite unexpected.

Search and Transform

Have you ever found yourself in a situation where you need to search through multiple files in your project, find a specific pattern, and then apply a transformation to it? It can be a tedious task, but fear not! In this blog post, I’ll walk you through a GenAIScript that does just that, automating the process and saving you time. 🕒💡

For example, when GenAIScript added the ability to pass the full command as a single string to exec, we needed to convert all scripts using

host.exec("cmd", ["arg0", "arg1", "arg2"])

to

host.exec(`cmd arg0 arg1 arg2`)

The Search And Transform guide covers the details of this new approach…
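
To give a feel for the approach, here is a rough sketch of what such a search-and-transform script could look like. The exact API signatures (workspace.grep options, the runPrompt context, workspace.writeText), the glob, and the prompt wording are assumptions on my part; refer to the guide and the GenAIScript reference for the real implementation:

```js
// sketch: find host.exec calls that pass an argument array and ask the LLM to rewrite them
const pattern = /host\.exec\(\s*"[^"]+"\s*,\s*\[[^\]]*\]\s*\)/g

// search across scripts (glob and option names are assumptions; see the GenAIScript docs)
const { files } = await workspace.grep(pattern, { glob: "**/*.genai.mjs" })

for (const file of files) {
    // run an inline prompt per matching file and write back the transformed content
    const res = await runPrompt((ctx) => {
        ctx.def("FILE", file)
        ctx.$`Rewrite every host.exec("cmd", ["arg0", ...]) call in FILE
        into the template-string form host.exec(\`cmd arg0 ...\`).
        Respond with the full, updated file content.`
    })
    await workspace.writeText(file.filename, res.text)
}
```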