You can register tools (also known as functions) that the LLM may decide to call as part of assembling the answer.
See OpenAI functions.
## Definition
`defTool` is used to define a tool that can be called by the LLM.
It takes a JSON schema to define the input and expects a string output. The LLM decides to call
this tool on its own!
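A minimal declaration might look like the following sketch. The schema details and the stub handler are illustrative; a real tool would call a weather service instead of returning hardcoded values.

```js
defTool(
    "current_weather",
    "get the current weather",
    {
        type: "object",
        properties: {
            location: {
                type: "string",
                description: "The city, e.g. Brussels",
            },
        },
        required: ["location"],
    },
    (args) => {
        const { location } = args
        // stub implementation: a real tool would query a weather API
        if (location === "Brussels") return "sunny"
        else return "variable"
    }
)
```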
In the example above, we define a tool called `current_weather`
that takes a location as input and returns the weather.
## Weather tool example
This example uses the current_weather tool to get the weather for Brussels.
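Putting the pieces together, a complete script might look like the following sketch (the file name, schema details, and prompt wording are assumptions):

```js
// weather.genai.mjs
script({ title: "weather tool" })

defTool(
    "current_weather",
    "get the current weather",
    {
        type: "object",
        properties: {
            location: { type: "string", description: "The city, e.g. Brussels" },
        },
        required: ["location"],
    },
    // stub handler: a real tool would query a weather API
    ({ location }) => (location === "Brussels" ? "sunny" : "variable")
)

$`What is the current weather in Brussels?`
```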
To pick and choose which tools to include in a script,
you can group them in system scripts. For example,
the `current_weather` tool can be included in the `system.current_weather.genai.mjs` script;
you can then reference the script id in the `tools` field.
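As a self-contained sketch (the file content and stub handler are assumptions), the system script might look like:

```js
// system.current_weather.genai.mjs
system({ title: "Get the current weather" })

defTool(
    "current_weather",
    "get the current weather",
    {
        type: "object",
        properties: {
            location: { type: "string", description: "The city, e.g. Brussels" },
        },
        required: ["location"],
    },
    // stub handler: a real tool would query a weather API
    ({ location }) => (location === "Brussels" ? "sunny" : "variable")
)
```

A script can then pull in the tool by id with `script({ tools: ["current_weather"] })`.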
## Builtin tools
- `fs_ask_file`: Runs an LLM query over the content of a file. Use this tool to extract information from a file.
- `fs_diff_files`: Computes a diff between two different files. Use `git diff` instead to compare versions of a file.
- `fs_find_files`: Finds files matching a glob pattern. Use `pattern` to specify a regular expression to search for in the file content. Be careful about asking for too many files.
- `fs_read_file`: Reads a file as text from the file system. Returns `undefined` if the file does not exist.
- `md_find_files`: Gets the file structure of the documentation markdown/MDX files. Returns the filename, title, and description for each match. Use `pattern` to specify a regular expression to search for in the file content.
- `meta_prompt`: Applies OpenAI's meta prompt guidelines to a user prompt. Modified from https://platform.openai.com/docs/guides/prompt-generation?context=text-out.
- `meta_schema`: Generates a valid JSON schema for the described JSON. Source: https://platform.openai.com/docs/guides/prompt-generation?context=structured-output-schema.
- `node_test`: Builds and tests the current project using `npm test`.
- `python_code_interpreter_run`: Executes Python 3.12 code for data analysis tasks in a Docker container. The process output is returned. Do not generate visualizations. The only packages available are numpy, pandas, and scipy. There is NO network connectivity. Do not attempt to install other packages or make web requests. You must copy all the necessary files or pass all the data, because the Python code runs in a separate container.
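Builtin tools are referenced by id like any other tool. For instance, a script might combine the file-system tools above to answer questions about a repository; this is a sketch, and the title and prompt wording are assumptions:

```js
script({
    title: "summarize docs",
    tools: ["fs_find_files", "fs_read_file"],
})

$`Find the markdown files under the docs folder and summarize each one.`
```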
Let’s illustrate how tools come together with a question answering script.
In the script below, we add the `system.retrieval_web_search` system script, which registers the `retrieval_web_search` tool. This tool
calls into `retrieval.webSearch` as needed.
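The script might look like the following sketch (the file name, title, and prompt wording are assumptions):

```js
// answers.genai.mjs
script({
    title: "answer questions",
    system: ["system", "system.retrieval_web_search"],
})

def("FILE", env.files)

$`Answer the questions in FILE, using a web search to gather information when needed.`
```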
We can then apply this script to the questions.md file below.
After the first request, the LLM asks to call `retrieval_web_search` for each question.
The web search answers are then added to the LLM message history and the request is made again.
The second request yields the final result, which includes the web search results.