
LLM as a tool

It is possible to combine tools and inline prompts to create a tool that uses an LLM to execute a prompt.

defTool(
    "llm-gpt35",
    "Invokes gpt-3.5-turbo to execute an LLM request",
    {
        prompt: {
            type: "string",
            description: "the prompt to be executed by the LLM",
        },
    },
    // forward the prompt to an inner gpt-3.5-turbo run and return its result
    async ({ prompt }) =>
        await runPrompt(prompt, {
            model: "openai:gpt-3.5-turbo",
            label: "llm-gpt35",
        })
)
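
Once declared, the tool is available to the top-level model like any other tool. A minimal usage sketch; the TEXT definition and the summarization request are illustrative, not part of the original example:

def("TEXT", env.files)
$`Use the llm-gpt35 tool to summarize TEXT in one sentence.`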

The inline prompts can declare their own tools or rely on system prompts that declare them.

defTool(
    "agent_file_system",
    `An agent that uses gpt-4o to execute LLM requests with tools that can search and read the file system.`,
    {
        prompt: {
            type: "string",
            description: "the prompt to be executed by the LLM",
        },
    },
    async ({ prompt }) =>
        // run an inner prompt with the file system tools enabled
        await env.generator.runPrompt(
            (_) => {
                _.$`You are an AI assistant that can help with file system tasks.
Answer the user question in the most concise way possible. Use wildcards and regex if needed.
If the question is ambiguous, ask for clarification.
Use tools to search and read the file system.
QUESTION:`
                _.writeText(prompt)
            },
            {
                model: "openai:gpt-4o",
                label: `llm-4o agent_fs ${prompt}`,
                tools: "fs",
            }
        )
)
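
The agent tool can then be invoked from the top-level prompt. A minimal usage sketch, where the question itself is only an illustration:

$`Use the agent_file_system tool to list the files that mention "defTool".`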