Getting Started
GenAIScript is a scripting language that integrates LLMs into the scripting process using a simplified JavaScript syntax. Supported by our VS Code GenAIScript extension, it allows users to create, debug, and automate LLM-based scripts.
Hello World
A GenAIScript is a JavaScript program that builds an LLM prompt, which is then executed by the GenAIScript runtime.
Let’s start with a simple script that tells the LLM to generate a poem. In typical use, GenAIScript files follow the naming convention `<scriptname>.genai.mjs` and are stored in the `genaisrc` directory in a repository. Let’s call this script `poem.genai.mjs`.
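A minimal `poem.genai.mjs` might look like this; the prompt text is illustrative:

```js
// poem.genai.mjs
// the $`...` template literal adds a user message to the prompt
$`Write a poem.`
```

This script requires the GenAIScript runtime to execute; it is not standalone Node.js.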
The `$...` syntax is a JavaScript template literal that renders to a user message in the LLM prompt. In this example, it would be:
In practice, your script may also import system scripts (specified automatically or manually) that add more messages to the request. So the final JSON payload sent to the LLM server might look more like this:
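A sketch of such a payload, in the common chat-completion shape; the model name and system text here are illustrative, and the exact wire format depends on the provider:

```json
{
  "model": "gpt-4o",
  "messages": [
    { "role": "system", "content": "You are a helpful assistant." },
    { "role": "user", "content": "Write a poem." }
  ]
}
```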
GenAIScripts can be executed from the command line or run with a right-click context menu selection inside Visual Studio Code. Because a GenAIScript is just JavaScript, the execution of a script follows the normal JavaScript evaluation rules. Once the script is executed, the generated messages are sent to the LLM server, and the response is processed by the GenAIScript runtime.
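From a terminal, a script can be run by name with the GenAIScript CLI; this assumes the script above was saved as `genaisrc/poem.genai.mjs`:

```sh
npx genaiscript run poem
```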
Here is an example output (shortened) for this prompt, returned by OpenAI gpt-4o.
GenAIScript supports extracting structured data and files from the LLM output as we will see later.
Variables
GenAIScripts support a way to declare prompt variables, which let you include content in the prompt and refer to it later in the script.
Let’s take a look at a `summarize` script that includes the content of a file and asks the LLM to summarize it.
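A sketch of such a script; the file path and prompt wording are assumptions:

```js
// read a file from the workspace (path relative to the workspace root)
const file = await workspace.readText("markdown.md")
// include its content in the prompt under the variable name FILE
def("FILE", file)
// reference the variable in the prompt
$`Summarize FILE. Be concise.`
```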
In this snippet, we use `workspace.readText` to read the content of a file (path relative to the workspace root) and `def` to include it in the prompt as a prompt variable. We then reference this variable in the prompt.
The `def` function supports many configuration flags to control how the content is included in the prompt. For example, you can insert line numbers or limit the number of tokens.
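For instance, a sketch using two such flags (option names follow the `def` reference; check the documentation for the full list):

```js
// include the file with line numbers, truncated to roughly 100 tokens
def("FILE", file, { lineNumbers: true, maxTokens: 100 })
```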
Files parameter
GenAIScripts are meant to work on a file or set of files. When you run a script in Visual Studio Code on a file or a folder, those files are passed to the script via the `env.files` variable. You can use `env.files` to replace hard-coded paths and make your scripts more reusable.
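A sketch of the summarizer rewritten to use `env.files` instead of a hard-coded path; the prompt wording is an assumption:

```js
// env.files holds the files the script was invoked on
def("FILE", env.files)
$`Summarize each file in FILE. Be concise.`
```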
And now, apply it to a bunch of files:
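For example, with the CLI you can pass a glob of files as arguments; the script name and glob pattern here are assumptions:

```sh
npx genaiscript run summarize "src/**/*.md"
```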
Processing outputs
GenAIScript processes the outputs of the LLM and extracts files, diagnostics and code sections when possible.
Let’s update the summarizer script to specify an output file pattern.
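A sketch of the updated script; the exact instruction wording is an assumption:

```js
const file = await workspace.readText("markdown.md")
def("FILE", file)
// ask the model to emit its answer as a new file
$`Summarize FILE. Save the output to a new file with the .summary extension.`
```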
Given this input, the model returns a string, which the GenAIScript runtime interprets based on what the prompt requested from the model:
Because the prompt requested that a file be written, the model has responded with content describing the file that should be created. In this case, the model has chosen to call that file `markdown-small.txt.summary`.
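The raw response might resemble the following sketch; the exact file markup depends on the system scripts and the model:

```
File markdown-small.txt.summary:

The document describes ...
```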
Our GenAIScript library parses the LLM output, interprets it, and in this case will create the file. If the script is invoked in VS Code, the file creation is exposed to the user via a Refactoring Preview or directly saved to the file system.
Of course, things can get more complex (with functions, schemas, and more), but this is the basic flow of a GenAIScript script. If you’re looking for an exhaustive list of prompting techniques, check out the prompt report.
Using tools
Tools are a way to register JavaScript callbacks with the LLM; they can be used to execute code, search the web, or read files.
Here is an example of a script that uses the `fs_read_file` tool to read a file and summarize it:
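A sketch of such a script; the file name is an assumption:

```js
// enable the built-in fs_read_file tool for this script
script({ tools: ["fs_read_file"] })
// the model is expected to call the tool to fetch the file content
$`Summarize the file markdown.md. Be concise.`
```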
A possible trace looks as follows.
Trace
As you can see, we are no longer using the `def` function; instead, we expect the LLM to issue a call to the `fs_read_file` tool to read the file `markdown.md` so that it receives that file’s content.
Note that this approach is less deterministic than using `def`, since the LLM might not call the tool. It also uses more tokens, because the LLM has to generate the tool call. Nonetheless, it is a powerful way to interact with the LLM.
Using agents
You can add one more layer of indirection and use `agent_fs`, a file system agent, to read the file. The agent combines a call to an LLM with a set of tools related to file system queries.
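A sketch of the script using the agent; the file name is an assumption:

```js
// enable the file system agent for this script
script({ tools: ["agent_fs"] })
// the agent decides which file system tools to call to answer
$`Summarize the file markdown.md.`
```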
Trace
Next steps
While GenAIScripts can be written in any IDE and run from the command line, users of the Visual Studio Code extension benefit greatly from the additional support it provides for writing, debugging, and executing GenAIScripts. We strongly recommend starting by installing the extension.