AI & ML Academy - Azure OpenAI
Welcome to the AI & ML Academy (AIA) - Azure OpenAI!
This section includes Azure OpenAI sample code, architecture examples, end-to-end scenarios, notebooks, and other QuickStart resources.
Also explore our Azure OpenAI resources page📖
We’ll start by covering Prompt Engineering Techniques, the first step in shaping LLM behavior and interactions.
Prompt Engineering Techniques
- System Message: Setting the Stage
The System Message acts as the opening act for our AI model. It sets the tone and context. Consider this example:
- System Message: “You’re an AI assistant that helps people find information and responds in rhyme.”
- When a user asks, “What’s the capital of France?” our AI responds with:
- “In Paris, the Seine does dance, where croissants and baguettes enhance.”
Remember, the System Message shapes the AI’s behavior. Experiment with different prompts to fine-tune your model’s personality.
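For reference, here is a minimal sketch of setting a system message with the openai Python SDK (v1+) against Azure OpenAI. The endpoint and key environment variables, the API version, and the `gpt-35-turbo` deployment name are assumptions for illustration, not values from this page.

```python
import os
from openai import AzureOpenAI

# Client setup; endpoint, key, and API version are placeholders for your own resource.
client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-02-01",
)

response = client.chat.completions.create(
    model="gpt-35-turbo",  # your chat model deployment name (assumed)
    messages=[
        # The system message sets the tone and context for every turn that follows.
        {"role": "system", "content": "You're an AI assistant that helps people find information and responds in rhyme."},
        {"role": "user", "content": "What's the capital of France?"},
    ],
)
print(response.choices[0].message.content)
```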
- Basic Best Practices for Prompting
When crafting prompts, follow these best practices (a sketch combining several of them appears after the list):
1. Put Instructions at the Beginning:
- Clearly state what you want the model to do.
- Example: Summarize the following text, highlighting the main point.
  Text: """
  {input}
  """
2. Be Explicit:
- Specify the desired format or output.
- Example: “Summarize the article in 3 sentences, highlighting the key figures.”
3. Avoid Negations:
- Instead of saying what not to do, focus on positive instructions.
- Example: “Provide a concise summary” (instead of “Don’t be verbose”).
4. Use Examples:
- Show the model what you expect.
- Example: “Translate the following: ‘Bonjour’” (with the expected output: “Hello”).
5. Iterate and Experiment:
- Fine-tune prompts based on model responses.
- Observe how different instructions impact results.
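As a rough illustration (not an official template), the snippet below combines several of these practices: instructions first, an explicit output format, positive phrasing, and delimiters around the input. The sample `article` text is invented for the example.

```python
# Hypothetical input text; in practice this is the article you want summarized.
article = "Azure OpenAI offers powerful language models accessible through a simple API."

# Instructions first, explicit output format, positive phrasing, delimiters around the input.
prompt = (
    "Summarize the following text in 3 sentences, highlighting the key figures.\n"
    'Text: """\n'
    f"{article}\n"
    '"""'
)
print(prompt)  # pass this as the user message in a chat completion call
```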
- Zero-Shot Prompting
- Definition:
- Zero-shot prompting tests the model’s ability to produce relevant outputs without relying on prior examples.
- You provide a prompt that is not part of the training data, and the model generates a result given your instructions.
- Example:
- Imagine asking the model: “Translate the following English text to Spanish: ‘Good morning!’”
- The model responds with: “¡Buenos días!”
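A minimal zero-shot sketch, assuming the same hypothetical client setup and deployment name as the earlier example; only the instruction is sent, with no worked examples in the prompt.

```python
import os
from openai import AzureOpenAI

# Same hypothetical client setup as the earlier sketch.
client = AzureOpenAI(azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
                     api_key=os.environ["AZURE_OPENAI_API_KEY"],
                     api_version="2024-02-01")

# Zero-shot: the instruction alone, with no examples included in the prompt.
response = client.chat.completions.create(
    model="gpt-35-turbo",  # your deployment name (assumed)
    messages=[{"role": "user", "content": "Translate the following English text to Spanish: 'Good morning!'"}],
)
print(response.choices[0].message.content)  # typically: ¡Buenos días!
```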
- Few-Shot Prompting (In-Context Learning)
- Definition:
- Few-shot prompting gives the model a few sample outputs (shots) to help it learn what the user wants.
- By providing this context, the model better understands the desired output.
- Example:
- Extract keywords from the corresponding texts below.
- Azure OpenAI and Language Models
- Text: “Azure OpenAI offers powerful language models that excel at understanding and generating text. Our API provides seamless access to these models, enabling developers to tackle a wide range of language-related tasks.”
- Keywords: Azure OpenAI, language models, text generation, API
- Integration with Web Applications
- Text: “Integrating Azure OpenAI into web applications is straightforward. Developers can leverage our APIs to enhance natural language understanding, sentiment analysis, and chatbot interactions.”
- Keywords: Azure OpenAI, web applications, APIs, natural language understanding, sentiment analysis, chatbots
- Custom Text (User-Specified)
- Text: {text}
- Keywords:
The model learns from these examples and responds similarly.
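One way to express the keyword-extraction example above as a few-shot chat prompt is sketched below, assuming the same hypothetical client setup; the final `text` value is made up for illustration.

```python
import os
from openai import AzureOpenAI

# Same hypothetical client setup as the earlier sketches.
client = AzureOpenAI(azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
                     api_key=os.environ["AZURE_OPENAI_API_KEY"],
                     api_version="2024-02-01")

text = "Developers can call Azure OpenAI from notebooks to build chat and search features."  # user-specified text (invented here)

messages = [
    {"role": "system", "content": "Extract keywords from the corresponding texts below."},
    # Shot 1
    {"role": "user", "content": "Text: Azure OpenAI offers powerful language models that excel at understanding and generating text. Our API provides seamless access to these models, enabling developers to tackle a wide range of language-related tasks."},
    {"role": "assistant", "content": "Keywords: Azure OpenAI, language models, text generation, API"},
    # Shot 2
    {"role": "user", "content": "Text: Integrating Azure OpenAI into web applications is straightforward. Developers can leverage our APIs to enhance natural language understanding, sentiment analysis, and chatbot interactions."},
    {"role": "assistant", "content": "Keywords: Azure OpenAI, web applications, APIs, natural language understanding, sentiment analysis, chatbots"},
    # The actual request: the model imitates the pattern established above.
    {"role": "user", "content": f"Text: {text}"},
]

response = client.chat.completions.create(model="gpt-35-turbo", messages=messages)  # deployment name assumed
print(response.choices[0].message.content)  # e.g. "Keywords: ..."
```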
- Chain-of-Thought (CoT) Prompting
- Definition:
- CoT prompting encourages reasoning and multi-step thinking.
- Providing a few sample questions and answers, followed by the actual question (a few-shot prompt), helps the model generate reasoning before responding to the prompt.
- Example:
- “Let’s think step-by-step: Roger started with 5 balls. 2 cans of 3 tennis balls each is 6 tennis balls. 5 + 6 = ?”
- Generated Response: “Roger started with 5 balls. Adding the 6 tennis balls from the cans, he now has 11 balls.”
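A sketch of a chain-of-thought style prompt under the same assumed setup; the closing “Let’s think step by step” cue asks the model to show its intermediate reasoning before the final answer.

```python
import os
from openai import AzureOpenAI

# Same hypothetical client setup as the earlier sketches.
client = AzureOpenAI(azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
                     api_key=os.environ["AZURE_OPENAI_API_KEY"],
                     api_version="2024-02-01")

cot_prompt = (
    "Roger has 5 tennis balls. He buys 2 more cans of tennis balls, and each can has 3 tennis balls. "
    "How many tennis balls does he have now?\n"
    "Let's think step by step."
)

# The closing cue nudges the model to spell out the intermediate steps
# (5 existing balls + 2 cans x 3 balls = 11) before stating the final answer.
response = client.chat.completions.create(
    model="gpt-35-turbo",  # your deployment name (assumed)
    messages=[{"role": "user", "content": cot_prompt}],
)
print(response.choices[0].message.content)
```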
Parameters
- Model: Depending on which model you use, you may see different results and latency. For example, GPT-3.5 Turbo returns responses faster than GPT-4, but GPT-4 has stronger reasoning, so its output can be better in more complex scenarios. Azure AI Studio and Hugging Face publish benchmarks that show how performance differs across tasks.
- Temperature & Top_p: These parameters control how varied the response will be. Both take values between 0 and 1: the closer temperature is set to 1, the more creative/random the output; the closer to 0, the more grounded it is, which suits use cases where factual data is needed. Temperature and top_p both control randomness, but in different ways, so the general recommendation is to adjust only one of the pair.
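A sketch showing where these parameters are passed on a chat completion call, under the same assumed client and deployment; only temperature is adjusted here, per the recommendation above.

```python
import os
from openai import AzureOpenAI

# Same hypothetical client setup as the earlier sketches.
client = AzureOpenAI(azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
                     api_key=os.environ["AZURE_OPENAI_API_KEY"],
                     api_version="2024-02-01")

# Low temperature for a grounded, factual answer; top_p is left at its default,
# following the recommendation to adjust only one of the two.
response = client.chat.completions.create(
    model="gpt-35-turbo",  # swap in a GPT-4 deployment to trade latency for stronger reasoning
    messages=[{"role": "user", "content": "List the capitals of the Nordic countries."}],
    temperature=0.0,
)
print(response.choices[0].message.content)
```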
Grounding
- The best way to get reliable answers is to provide the model with context. For example, if you asked, “Who is responsible for project A on my team?”, the model would have no way to answer. Given the context “Sally is in charge of project A, Sam of project B, and Steve of project C,” the model can respond with an answer grounded in that context.
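A sketch of grounding under the same assumed setup: the context is supplied in the system message so the model answers from it rather than guessing.

```python
import os
from openai import AzureOpenAI

# Same hypothetical client setup as the earlier sketches.
client = AzureOpenAI(azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
                     api_key=os.environ["AZURE_OPENAI_API_KEY"],
                     api_version="2024-02-01")

# Context the model would otherwise have no way of knowing.
context = "Sally is in charge of project A, Sam of project B, and Steve of project C."

response = client.chat.completions.create(
    model="gpt-35-turbo",  # your deployment name (assumed)
    messages=[
        {"role": "system", "content": f"Answer using only the following context:\n{context}"},
        {"role": "user", "content": "Who is responsible for project A on my team?"},
    ],
    temperature=0.0,
)
print(response.choices[0].message.content)  # grounded answer: Sally
```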
Getting Started
Start here for an introduction to using GPT models programmatically.
Name | Description |
---|---|
QuickStart | A collection of notebooks where you can quickly start with using GPT (such as creating resources, code generation, prompt engineering, LLM chain demo) |
Demo-ready Resources
Below is a table of official Azure OpenAI Accelerators and workshops from the OpenAI Accelerators and Demo Assets repository.
Name | Description |
---|---|
OpenAI Cookbook - Examples | Example code and guides for accomplishing common tasks with the OpenAI API. See the README for information on running the examples. Most code examples are written in Python, though the concepts can be applied in any language. |
Azure OpenAI Samples | Resources to help you understand how to use GPT (Generative Pre-trained Transformer) offered by Azure OpenAI at the fundamental level, explore sample end-to-end solutions, and learn about various use cases. |
Azure OpenAI Workshop | In this workshop, you will learn how to use the Azure OpenAI service to create AI powered solutions. You will get hands-on experience with the latest AI technologies and will learn how to use Azure OpenAI API. |
Semantic Kernel | Semantic Kernel (SK) is a lightweight SDK enabling integration of AI Large Language Models (LLMs) with conventional programming languages. |
Visual ChatGPT | Visual ChatGPT connects ChatGPT and a series of Visual Foundation Models to enable sending and receiving images during chatting. |
Notebooks
Examples of applications for various industries, provided as complete Jupyter notebooks that combine Azure OpenAI with additional Azure services.
Name | Application | Description | Components |
---|---|---|---|
Income Statement Analysis | Text/Code generation | • Read a table into a Pandas DataFrame and generate code to analyze insights • Generate 2-3 tag lines based on the podcast content | • Davinci-002 • Form Recognizer |
Form Recognizer Examples | Text generation | • Validate format of text extracted from Form Recognizer • Correct formats of text | • Form Recognizer |
Customer Service Call | Summarization, Text generation | • Summarize call • Generate list of follow-up tasks | • Cognitive Services Language |
Loan Call | Summarization, Text generation | • Identify and extract PII • OpenAI validates whether extracted text is PII or normal conversation • Summarize call • Generate list of follow-up tasks | • Cognitive Services Language |
Pharmacy Call | Summarization, Text generation | • Summarize call • Provide list of medications, dose, and form discussed in the call | • Cognitive Services Language |
Conversation SSML | Summarization, Text generation | • Summarize call from audio file • List call participants • Generate list of follow-up tasks • Extract components of conversation from transcript based on keys ("reason", "cause", "caller") | • Cognitive Services Language |
End-to-end Solutions
A collection of solution accelerators (repositories) that show you how to create a robust, end-to-end Azure solution that leverages OpenAI.
Name | Application | Description | Components |
---|---|---|---|
Business Process Automation | Summarization, Search | Creates pipelines to analyze text and audio datasets across multiple Cognitive Services and the Hugging Face library. The accelerator deploys all of the resources and transforms the input data at each step, allowing multiple Cognitive Services to be called and deployed within a single, end-to-end pipeline. Includes capabilities like Azure OpenAI (summarization or custom prompts) and integration with Cosmos DB, Cognitive Search, and RediSearch for vector search. | • Cognitive Services (Speech, Language, Form Recognizer, Read API) • Cognitive Search • Azure Machine Learning • Cosmos DB |
ChatGPT + Enterprise data with Azure OpenAI and Cognitive Search | Search | This sample demonstrates a few approaches for creating ChatGPT-like experiences over your own data using the Retrieval Augmented Generation pattern. It uses Azure OpenAI Service to access the ChatGPT model (gpt-35-turbo) and Azure Cognitive Search for data indexing and retrieval. NOTE: sample created by Product Group | • GPT/ChatGPT • App UX • App Server • Azure SQL • Azure Cosmos DB |
Accelerator powered by Azure Cognitive Search + Azure OpenAI | Search | The goal of the MVP workshop is to show/prove the value of a smart search engine built with Azure services, using your own data in your own environment. This repository comes with a PPT deck for a two-day client workshop. | • Cognitive Search • Cognitive Services (Text Analytics, Translator, Computer Vision) • OpenAI Embedding and Completion models • Cosmos DB • LangChain • Web App |
Summarization Python OpenAI | Search, Summarization | End-to-end, ready-to-deploy solution architecture combining Cognitive (Enterprise) Search with OpenAI in a Python notebook to search for relevant information and then summarize the content for the user in a concise and succinct manner. | • Blob Storage • Cognitive Search • OpenAI Embedding and Completion models (GPT-3) |
Knowledge Mining with Azure OpenAI | Search | The purpose of this repo is to accelerate the deployment of a Python-based knowledge mining solution with OpenAI that ingests a knowledge base, generates embeddings from the extracted contents, stores them in a vector search engine (Redis), and uses that engine to answer queries/questions specific to that knowledge base. | • Includes a Cognitive Search component and a Power Virtual Agents bot using ChatGPT • Form Recognizer • Event Grid • Cognitive Search • Azure Function • Cosmos DB • Service Bus • Cognitive Services • Translator • Redis |
ChatGPT Retrieval Plugin | Search | The ChatGPT Retrieval Plugin repository provides a flexible solution for semantic search and retrieval of personal or organizational documents using natural language queries. Includes examples and documentation for usage. | • ChatGPT |
Build your first AOAI application with PowerApps | Summarization, Text Generation, Search | Submit prompts to OpenAI from a Power App using the OpenAI Python SDK. | • Power App • LangChain • Azure OpenAI Embeddings API |
A natural language query application on SQL data (Advanced) | Code/Text Generation | This scenario uses OpenAI as an intelligent agent that takes business questions from end users and generates SQL queries from those prompts. The implementation focuses on translating natural-language business questions into queries for database retrieval. | • Power App • Azure Function • Azure SQL |
Build an Open AI Pipeline to Ingest Batch Data, Perform Intelligent Operations, and Analyze in Synapse (Advanced) | Summarization, Search | This scenario uses OpenAI to summarize and analyze customer service call logs for the fictitious company Contoso. The data is ingested into a Blob Storage account and then processed by an Azure Function, which returns the customer sentiment, the product offering the conversation was about, the topic of the call, and a summary of the call. These results are written to a separate designated location in Blob Storage. From there, Synapse Analytics pulls in the newly cleansed data to create a table that can be queried to derive further insights. | • Synapse Analytics • Blob Storage • Azure Function |
Using Azure OpenAI on custom dataset (Advanced) | Q&A, Search, Text Generation | This scenario uses OpenAI as an intelligent agent to answer questions from end users or assist them using knowledge of a proprietary corpus and domain, across a variety of applications. | • Power App • Form Recognizer • Azure Cognitive Search • Azure Function |