# GenAIScript

GenAIScript is a tool for generating and executing scripts using LLMs. It is used in PromptPex to generate the test generation scripts.
## Try PromptPex

- Install Node.js v22+
- Configure your LLM credentials in `.env`. You can use OpenAI, Azure OpenAI, or Ollama.

```sh
npx --yes genaiscript configure
```

- Launch PromptPex locally

```sh
npx --yes genaiscript@latest serve --remote microsoft/promptpex --remote-branch dev
```
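The `.env` file mentioned above holds your provider credentials. A minimal sketch, assuming common GenAIScript provider settings (the variable names below are indicative, not exhaustive; `npx --yes genaiscript configure` walks you through the exact keys for your provider):

```sh
# Pick ONE provider block; keys shown are illustrative.

# OpenAI
OPENAI_API_KEY=...

# Azure OpenAI
AZURE_OPENAI_API_ENDPOINT=https://<resource>.openai.azure.com

# Ollama (local)
OLLAMA_HOST=http://localhost:11434
```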
## Docker

To launch PromptPex in a Docker container, first create an image with the following command:

```sh
docker build -t genaiscript - <<EOF
FROM node:lts-alpine
RUN apk add --no-cache git && npm install -g genaiscript
EOF
```
Launch PromptPex using the `genaiscript` image:

```sh
docker run --env GITHUB_TOKEN --env-file .env --name genaiscript --rm -it --expose 8003 -p 8003:8003 -v ${PWD}:/workspace -w /workspace genaiscript genaiscript serve --network --remote microsoft/promptpex --remote-branch dev
```
## GitHub Codespaces

Use GitHub Codespaces or the dev container to get a fully configured environment, including access to LLMs through GitHub Marketplace Models. Then launch the server:

```sh
npm run serve
```
## Local development

- Clone this repository
- Install Node.js v22+
- Install dependencies

```sh
npm install
```
## Configure the eval, rules, baseline aliases

PromptPex defines the following model aliases for the different phases of the test generation:

- `rules`: rule, inverse rules, and test generation
- `eval`: rule and test quality evaluations
- `baseline`: baseline test generation

If you are using a specific set of models, you can use a `.env` file to override the `eval`/`rules`/`baseline` aliases:

```sh
GENAISCRIPT_MODEL_EVAL="azure:gpt-4o_2024-11-20"
GENAISCRIPT_MODEL_RULES="azure:gpt-4o_2024-11-20"
GENAISCRIPT_MODEL_BASELINE="azure:gpt-4o_2024-11-20"
```
## Web interface

- Launch the web interface

```sh
npm run serve
```

- Open the localhost URL printed in the terminal
## Development

The development of PromptPex is done using GenAIScript.

- Install Node.js v22+
- Configure your LLM credentials in `.env`
## Typecheck scripts

Use Visual Studio Code to get built-in typechecking from TypeScript, or run:

```sh
npm run build
```
## Create a commit

For convenience, run:

```sh
npm run gcm
```
## Debugging

- Open a `JavaScript Debug Terminal` in Visual Studio Code
- Put a breakpoint in your script
- Launch the script
## Upgrade dependencies

```sh
npm run upgrade
```
## Diagnostics mode

Set the `DEBUG=promptpex:*` environment variable to enable additional logging:

```sh
DEBUG=promptpex:* npm run ...
```

To pipe stderr and stdout to a file:

```sh
DEBUG=* npm run ... > output.txt 2>&1
```
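The `> output.txt 2>&1` redirection sends stdout to the file, then points stderr at the same destination. A stand-in sketch (using `echo` in place of `npm run ...`) showing that both streams land in the file:

```shell
# Stand-in for `DEBUG=* npm run ...`: emit one line on stdout and one on stderr,
# redirecting both into output.txt.
{ echo "stdout: progress"; echo "stderr: debug log" >&2; } > output.txt 2>&1
cat output.txt
```

Note the order matters: `2>&1 > output.txt` would redirect stderr to the terminal, not the file.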
## Caching

Add `--vars cache=true` to the command line to enable caching of LLM calls.
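For example, when running a script through the GenAIScript CLI (the script name below is a placeholder; substitute your own invocation):

```sh
npx --yes genaiscript run <script> --vars cache=true
```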