Programmatically assemble prompts for LLMs using JavaScript. Orchestrate LLMs, tools, and data in a single script.
JavaScript toolbox to work with prompts
Abstractions that make prompting easy and productive
Seamless Visual Studio Code integration
Hello world
Say you want to create an LLM script that generates a ‘hello world’ poem. You can write the following script:
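A minimal sketch of such a script. In GenAIScript the runtime provides $ as a global template tag; a simplified stub is included below so the snippet is self-contained outside the runtime (the stub only assembles the prompt string, it does not call an LLM):

```javascript
// Stub of the $ template tag: joins the literal parts and any
// interpolated values into a single prompt string. The real runtime
// also sends the assembled prompt to the configured LLM.
const $ = (strings, ...values) =>
  strings.reduce((out, s, i) => out + s + (values[i] ?? ""), "");

// The actual script body is a single line:
const prompt = $`Write a 'hello world' poem.`;
console.log(prompt);
```

Inside the GenAIScript runtime you would write only the tagged-template line; the stub exists purely so the sketch runs standalone.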
The $ function is a template tag that creates a prompt. The prompt is then sent to the LLM you configured, which generates the poem.
Let’s make it more interesting by adding files, data, and structured output. Say you want to include a file in the prompt, and then save the output in a file. You can write the following script:
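A sketch of such a script. The def function and the env.files variable are provided by the GenAIScript runtime; simplified stubs are included here (with hypothetical sample content) so the snippet runs on its own and shows how the pieces combine into one prompt:

```javascript
// Stubs standing in for the GenAIScript runtime globals.
const parts = [];

// def registers named content (e.g. a file) as part of the prompt
// and returns the name so it can be referenced in the prompt text.
const def = (name, content) => {
  parts.push(`${name}:\n${content}`);
  return name;
};

// In the runtime, env.files holds the files passed to the script;
// here it is a hypothetical placeholder.
const env = { files: "sample file content" };

// The $ tag appends the prompt text to the same assembled prompt.
const $ = (strings, ...values) => {
  const text = strings.reduce((out, s, i) => out + s + (values[i] ?? ""), "");
  parts.push(text);
  return text;
};

// The script body: include the file, then ask for structured output.
def("FILE", env.files);
$`Analyze FILE and extract a summary to data.json.`;

const prompt = parts.join("\n\n");
console.log(prompt);
```

In the runtime, the LLM's response containing a data.json file is parsed and saved for you, as described below.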
The def function includes the content of the file, optimizing it if necessary for the target LLM. GenAIScript also parses the LLM output and extracts the data.json file automatically.
Bicep Best Practices: Learn how to apply best practices to Azure Bicep files for more efficient and maintainable infrastructure as code.
SEO Front Matter: Learn how to automate the creation of SEO-optimized front matter for your markdown documents with GenAIScript.
Documentation Translations: Explore the challenges and solutions for localizing MakeCode documentation with custom macros while maintaining rich rendering in multiple languages.
Blocks Localization: Learn how to localize MakeCode programming blocks while preserving block properties and variable names for international audiences.
Release Notes: Generate comprehensive release notes combining commit history and code diffs.
TLA+ AI Linter: Explore how the TLA+ AI Linter leverages GenAI scripts and LLMs to enhance TLA+ specifications with automated linting and consistent comment verification.
Image Alt Text: Learn how to automatically generate descriptive alt text for images using the OpenAI Vision model to enhance accessibility and SEO.
Containerized Tools: Learn how to create and use containerized tools with executable dependencies in a secure environment, using GCC as an example.
Generated Knowledge: Explore the technique of generated knowledge in AI prompting to enhance accuracy in answering questions.
Phi-3 Mini with Ollama: Learn how to integrate Phi-3 Mini, a powerful 3.8B parameter model by Microsoft, with Ollama for local execution of state-of-the-art AI models.
agent user_input: Ask the user for input to confirm, select, or answer the question in the query. The message should be very clear and provide all the context.