
Metadata

Prompts use the script({ ... }) function call to configure the title and other user interface elements.

The call to script is optional and can be omitted if you don’t need to configure the prompt. However, the script argument must be a valid JSON5 literal, as the script is parsed, not executed, when mining metadata.

The title, description, and group are (optionally) used in the UI to display the prompt.

script({
    title: "Shorten", // displayed in UI
    // also displayed but grayed out:
    description:
        "A prompt that shrinks the size of text without losing meaning",
    group: "shorten", // see Inline prompts later
})

Override the system prompts included with the script. The default set of system prompts is inferred dynamically from the script content.

script({
    ...,
    system: ["system.files"],
})

You can specify the LLM model identifier in the script. The IntelliSense provided by genaiscript.g.ts will assist in discovering the list of supported models. Use large and small aliases to select default models regardless of the configuration.

script({
    ...,
    model: "openai:gpt-4o",
})
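For portability, a script can also reference an alias instead of a concrete provider/model pair. A minimal sketch (the exact model an alias resolves to depends on your configuration):

```javascript
script({
    ...,
    // "large" resolves to the configured default large model
    // (for example openai:gpt-4o), regardless of provider
    model: "large",
})
```

This keeps the script itself provider-neutral, so the same prompt can run against different backends without edits.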

You can specify the maximum number of completion tokens in the script. The default is unspecified.

script({
    ...,
    maxTokens: 2000,
})

Limits the number of function/tool calls allowed during a generation. This is useful to prevent infinite loops.

script({
    ...,
    maxToolCalls: 100,
})

You can specify the LLM temperature in the script, between 0 and 2. The default is 0.8.

script({
    ...,
    temperature: 0.8,
})

You can specify the LLM top_p in the script. The default is unspecified.

script({
    ...,
    top_p: 0.5,
})

You can specify the LLM seed in the script, for models that support it. The default is unspecified.

script({
    ...,
    seed: 12345678,
})
  • unlisted: true: don’t show the script to the user in lists. system.* templates are automatically unlisted.
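For example, a helper script can opt out of listings. A minimal sketch (the title is illustrative):

```javascript
script({
    title: "Internal helper",
    // hide this script from user-facing lists
    unlisted: true,
})
```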

See genaiscript.d.ts in the sources for details.

You can consult the metadata of the top-level script in the env.meta object.

const { model } = env.meta

Use the host.resolveModel function to resolve a model name or alias to its provider and model name.

const info = await host.resolveModel("large")
console.log(info)
// {
//   "provider": "openai",
//   "model": "gpt-4o"
// }