
Getting Started
GenAIScript is a scripting language that integrates LLMs into the scripting process using a simplified JavaScript syntax. Supported by our VS Code GenAIScript extension, it allows users to create, debug, and automate LLM-based scripts.
Preamble
Before you start writing GenAIScripts, you will need to configure your environment to have access to an LLM. The configuration section covers this topic in detail, as there are many options to consider.
Hello World
A GenAIScript is a JavaScript program that builds an LLM prompt, which is then executed by the GenAIScript runtime.
Let’s start with a simple script that tells the LLM to generate a poem. In typical use, GenAIScript files follow the naming convention `<scriptname>.genai.mjs` and are stored in the `genaisrc` directory of a repository. Let’s call this script `poem.genai.mjs`.

```js
$`Write a poem in code.`
```
The `$...` syntax is a JavaScript template literal that renders to a user message in the LLM prompt. In this example, it would be:

```text
Write a poem in code.
```
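As a rough mental model (plain JavaScript, not the actual GenAIScript implementation), a tag function attached to a template literal can assemble the final message string:

```javascript
// Hypothetical sketch: a tag function that joins the pieces of a
// template literal into one prompt string, roughly how $`...` could
// render a user message.
function $(strings, ...values) {
  return strings.reduce((out, s, i) => out + s + (values[i] ?? ""), "");
}

const topic = "code";
const message = $`Write a poem in ${topic}.`;
console.log(message); // Write a poem in code.
```

In the real runtime, the tag function also registers the rendered text as a user message in the request rather than just returning a string.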
In practice, your script may also import system scripts (automatically or manually specified) that add more messages to the requests. So the final JSON payload sent to the LLM server might look more like this:
```json
{
  ...
  "messages": [
    { "role": "system", "content": "You are helpful. ..." },
    { "role": "user", "content": "Write a poem in code." }
  ]
}
```
GenAIScripts can be executed from the command line or run with a right-click context menu selection inside Visual Studio Code. Because a GenAIScript is just JavaScript, the execution of a script follows the normal JavaScript evaluation rules. Once the script is executed, the generated messages are sent to the LLM server, and the response is processed by the GenAIScript runtime.
```sh
# npm
npx genaiscript run poem
# pnpm
pnpx genaiscript run poem
# yarn
yarn dlx genaiscript run poem
```
Here is an example output for this prompt (shortened), returned by OpenAI gpt-4o.
```python
def poem():
    # In the silence of code,
    ...
    # And thus, in syntax sublime,
    # We find the art of the rhyme.
```
GenAIScript supports extracting structured data and files from the LLM output as we will see later.
Variables
GenAIScripts support prompt variables, which let you include content in the prompt and refer to it later in the script.
Let’s take a look at a `summarize` script that includes the content of a file and asks the LLM to summarize it.

```js
def("FILE", workspace.readText("some/relative/markdown.txt"))
$`Summarize FILE in one sentence.`
```
In this snippet, we use `workspace.readText` to read the content of a file (path relative to the workspace root), and we use `def` to include it in the prompt as a prompt variable. We then reference this variable in the prompt. The rendered prompt looks like this:
FILE:
```text file="some/relative/markdown.txt"
What is Markdown?
Markdown is a lightweight markup language that you can use to add formatting elements to plaintext text documents. Created by John Gruber in 2004, Markdown is now one of the world’s most popular markup languages.
```

Summarize FILE in one sentence.
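As a rough mental model (plain JavaScript with made-up names, not the runtime code), `def` can be thought of as appending a named, fenced block to the prompt:

```javascript
// Hypothetical sketch: render a named prompt variable as a fenced
// block, similar in shape to what def("FILE", ...) produces.
function renderVariable(name, file) {
  return `${name}:\n\`\`\`text file="${file.filename}"\n${file.content}\n\`\`\``;
}

const block = renderVariable("FILE", {
  filename: "some/relative/markdown.txt",
  content: "What is Markdown?",
});
console.log(block);
```

Because the variable has a name, later sentences in the prompt (such as "Summarize FILE") can refer to it unambiguously.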
The `def` function supports many configuration flags to control how the content is included in the prompt. For example, you can insert line numbers or limit the number of tokens.

```js
def("FILE", ..., { lineNumbers: true, maxTokens: 100 })
```
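To make the effect of such a flag concrete, here is a plain-JavaScript sketch (a hypothetical helper, not the actual runtime code) of what a line-numbering option might do to the included content:

```javascript
// Hypothetical helper: prefix each line with its 1-based number,
// similar in spirit to def(..., { lineNumbers: true }).
function withLineNumbers(text) {
  return text
    .split("\n")
    .map((line, i) => `${i + 1}: ${line}`)
    .join("\n");
}

console.log(withLineNumbers("What is Markdown?\nMarkdown is a lightweight markup language."));
// 1: What is Markdown?
// 2: Markdown is a lightweight markup language.
```

Line numbers help the LLM refer to precise locations in the file, for example when generating diagnostics or edits.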
Files parameters
GenAIScripts are meant to work on a file or set of files. When you run a script in Visual Studio Code on a file or a folder, those files are passed to the script through the `env.files` variable. You can use `env.files` to replace hard-coded paths and make your scripts more reusable.
```js
// summarize all files in the env.files array
def("FILE", env.files)
$`Summarize FILE in one sentence.`
```
Now apply it to a set of files:
```sh
# npm
npx genaiscript run summarize "**/*.md"
# pnpm
pnpx genaiscript run summarize "**/*.md"
# yarn
yarn dlx genaiscript run summarize "**/*.md"
```
Processing outputs
GenAIScript processes the output of the LLM and extracts files, diagnostics, and code sections when possible.
Let’s update the summarizer script to specify an output file pattern.
```js
// summarize all files in the env.files array
def("FILE", env.files)
$`Summarize each FILE in one sentence. Save each generated summary to "<filename>.summary"`
```
Given this input, the model returns a string, which the GenAIScript runtime interprets based on what the prompt requested:
File src/samples/markdown-small.txt.summary:
```text
Markdown is a lightweight markup language created by John Gruber in 2004, known for adding formatting elements to plaintext text documents.
```
Because the prompt requested that a file be written, the model has responded with content describing the contents of the file that should be created. In this case, the model has chosen to call that file `markdown-small.txt.summary`.
Our GenAIScript library parses the LLM output, interprets it, and in this case will create the file. If the script is invoked in VS Code, the file creation is exposed to the user via a Refactoring Preview or directly saved to the file system.
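To illustrate the kind of parsing involved, here is a simplified sketch (plain JavaScript, with an assumed `File <name>:` marker followed by a fenced block; the real runtime handles many more output shapes):

```javascript
// Hypothetical sketch: extract "File <name>:" markers followed by a
// fenced code block from model output, loosely mimicking how the
// runtime turns LLM text into file edits.
function extractFiles(output) {
  const files = {};
  const re = /File (\S+):\s*```\w*\n([\s\S]*?)```/g;
  for (const m of output.matchAll(re)) {
    files[m[1]] = m[2].trimEnd();
  }
  return files;
}

const output = 'File a.summary:\n```text\nA short summary.\n```';
console.log(extractFiles(output)); // { 'a.summary': 'A short summary.' }
```

Each extracted entry can then be previewed as a pending edit or written directly to disk.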
Of course, things can get more complex (functions, schemas, and more), but this is the basic flow of a GenAIScript script. If you’re looking for an exhaustive list of prompting techniques, check out the prompt report.
Using tools
Tools are a way to register JavaScript callbacks with the LLM; they can execute code, search the web, or even read files!
Here is an example of a script that uses the `fs_read_file` tool to read a file and summarize it:

```js
script({ tools: "fs_read_file" })

$`- read the file markdown.md
- summarize it in one sentence.
- save output to markdown.md.txt`
```
A possible trace looks as follows.
Trace:
- prompting github:gpt-4o
- cat src/rag/markdown.md
- prompting github:gpt-4o

FILE ./markdown.md.txt:
```text
Markdown is a lightweight ...
```
As you can see, we are no longer using the `def` function; instead, we expect the LLM to issue a call to the `fs_read_file` tool to read the file `markdown.md` so that it receives the content of that file.

Note that this approach is less deterministic than using `def`, as the LLM might not call the tool. Moreover, it uses more tokens, as the LLM has to generate the code to call the tool. Nonetheless, it is a powerful way to interact with the LLM.
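Conceptually, a tool is a named callback that the runtime dispatches when the model asks for it. Here is a minimal plain-JavaScript sketch (with made-up names; this is not the GenAIScript API) of that dispatch loop:

```javascript
// Hypothetical sketch of a tool registry: the model emits a tool call
// (name + arguments), the runtime dispatches to the matching callback,
// and the result is fed back into the conversation.
const tools = {
  // stand-in callback for a real file read
  fs_read_file: ({ filename }) => `contents of ${filename}`,
};

function dispatch(toolCall) {
  const tool = tools[toolCall.name];
  if (!tool) throw new Error(`unknown tool: ${toolCall.name}`);
  return tool(toolCall.arguments);
}

// Simulated model response requesting a tool call:
const result = dispatch({
  name: "fs_read_file",
  arguments: { filename: "markdown.md" },
});
console.log(result); // contents of markdown.md
```

The tool result is appended to the messages and the model is prompted again, which is why tool-based scripts involve multiple LLM round trips.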
Using agents
You can add one more layer of indirection and use `agent_fs`, a file system agent, to read the file. The agent combines a call to an LLM with a set of tools related to file system queries.
```js
script({ tools: "agent_fs" })

$`- read the file src/rag/markdown.md
- summarize it in one sentence.
- save output to file markdown.md.txt (override existing)`
```
Trace:
- prompting github:gpt-4o (~1569 tokens)
- agent fs: read and summarize file src/rag/markdown.md in one sentence
  - prompt agent memory query with github:gpt-4o-mini: "NO_ANSWER"
  - prompt agent fs with github:gpt-4o (~422 tokens)
    - cat src/rag/markdown.md
    - prompting github:gpt-4o (~635 tokens)

```md
The file "src/rag/markdown.md" explains that Markdown...
```

- prompting github:gpt-4o (~1625 tokens)

I'll save the summary to the file `markdown.md.txt`.

FILE markdown.md.txt:
```
The file "src/rag/markdown.md" explains that Markdown....
```
Next steps
While GenAIScripts can be written with any IDE and run from the command line, users of the GenAIScript extension in Visual Studio Code benefit greatly from the additional support it provides for writing, debugging, and executing scripts. We strongly recommend starting by installing the extension.