The runPrompt function builds an inner LLM invocation and returns the output of that prompt.
You can also shortcut the function and pass the prompt text directly.
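A minimal sketch of both forms, assuming the GenAIScript runtime provides the runPrompt global and the prompt context object (the exact result shape is an assumption here):

```js
// run an inner prompt built with a callback and capture its output
// (assumes the GenAIScript runtime supplies runPrompt and the prompt context)
const { text } = await runPrompt((_) => {
    _.$`Write a one-sentence haiku about software testing.`
})

// shortcut form: pass the prompt text directly
const { text: haiku } = await runPrompt(
    "Write a one-sentence haiku about software testing."
)
```

The callback form is useful when the inner prompt needs to define variables or compose multiple fragments; the shortcut form covers the common case of a single text prompt.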
Limitations
- Nested functions are not supported in the inner prompt.
Example: Summary of file summaries using gpt-3.5
The snippet below uses gpt-3.5
to summarize files individually before
adding them to the main prompt.
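The original snippet is missing here; the sketch below reconstructs the pattern under stated assumptions. It loops over env.files, runs an inner prompt per file, and feeds each summary back into the main prompt. The model identifier "gpt-3.5-turbo" and the one-sentence instruction are assumptions, not confirmed by the source.

```js
// summarize each file individually with gpt-3.5 in an inner prompt,
// then add the per-file summaries to the main prompt
for (const file of env.files) {
    const { text } = await runPrompt(
        (_) => {
            // expose the file content to the inner prompt
            _.def("FILE", file)
            _.$`Summarize FILE in one sentence.`
        },
        { model: "gpt-3.5-turbo" } // assumed model identifier
    )
    // register the summary (not the full file) in the outer prompt
    def("FILE", { filename: file.filename, content: text })
}
$`Summarize the FILE summaries into a single paragraph.`
```

Running the summaries in inner prompts keeps the large file contents out of the main context window; only the short summaries reach the final invocation.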
Example: Summary of file summaries using Phi-3
The snippet below uses Phi-3
through Ollama to summarize files individually before adding them to the main prompt.
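As above, the snippet itself did not survive extraction; this sketch follows the same pattern with a locally hosted model. The "ollama:phi3" model identifier is an assumption based on the Ollama integration described in the text.

```js
// same summarization pattern, but the inner prompts run Phi-3
// through a local Ollama server instead of a hosted model
for (const file of env.files) {
    const { text } = await runPrompt(
        (_) => {
            _.def("FILE", file)
            _.$`Summarize FILE in one sentence.`
        },
        { model: "ollama:phi3" } // assumed Ollama model identifier
    )
    def("FILE", { filename: file.filename, content: text })
}
$`Summarize the FILE summaries into a single paragraph.`
```

Because only the model option changes, the same script structure works for hosted and local models alike.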