Prompt flow#
Prompt flow is a suite of development tools designed to streamline the end-to-end development cycle of LLM-based AI applications, from ideation, prototyping, testing, and evaluation to production deployment and monitoring. It makes prompt engineering much easier and enables you to build LLM apps with production quality.
With prompt flow, you will be able to:
Create flows that link LLMs, prompts, Python code, and other tools together in an executable workflow.
Debug and iterate your flows with ease, especially when tracing interactions with LLMs.
Evaluate your flows and calculate quality and performance metrics with larger datasets.
Integrate the testing and evaluation into your CI/CD system to ensure the quality of your flow.
Deploy your flows to the serving platform you choose or integrate into your app’s code base easily.
(Optional but highly recommended) Collaborate with your team by leveraging the cloud version of Prompt flow in Azure AI.
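As a rough sketch of the first capability above, a flow links steps (a prompt template, an LLM call, Python post-processing) into one executable pipeline. The snippet below hand-rolls that idea with a stubbed model call so it runs without prompt flow or any model access; none of these function names are prompt flow APIs, they only illustrate the shape of a flow.

```python
# Illustrative only: a hand-rolled pipeline mirroring what a prompt
# flow graph wires together. These names are not prompt flow APIs.

def render_prompt(template: str, **kwargs) -> str:
    """Prompt node: fill a template with the flow's inputs."""
    return template.format(**kwargs)

def call_llm(prompt: str) -> str:
    """LLM node: stubbed here; a real flow would call a model."""
    return f"[model answer to: {prompt}]"

def postprocess(answer: str) -> str:
    """Python node: normalize the model output."""
    return answer.strip()

def run_flow(question: str) -> str:
    """Execute the nodes in order, passing each output as the next input."""
    prompt = render_prompt("Answer briefly: {question}", question=question)
    return postprocess(call_llm(prompt))

print(run_flow("What is prompt flow?"))
```

In an actual flow, each of these steps would be a node defined in the flow's YAML (or a Python function registered as a tool), and prompt flow would handle the wiring, tracing, and batch execution for you.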
You are welcome to join us in making prompt flow better by participating in discussions, opening issues, and submitting PRs.
This documentation site contains guides for prompt flow SDK, CLI, and VS Code extension users.
🚀 Quick Start
Quick start and end-to-end tutorials.
- Getting started with prompt flow
- E2E development tutorial: chat with PDF
- Find more: tutorials & samples
📒 How-to Guides
Articles that guide users through completing specific tasks in prompt flow.
- Tracing
- Develop a flow
- Run and evaluate a flow
- Deploy a flow
📑 Concepts
Introduction to the key concepts of prompt flow.
- Flows
- Tools
- Connections
- Design principles
🔍 Reference
Reference provides technical information about the prompt flow API.
- Command line Interface reference: pf
- Python library reference: promptflow
- Tool reference: LLM Tool, Python Tool, Prompt Tool
- ChangeLog: ChangeLog