Update the .env.sample file (and save it as .env) according to your model names, if you haven’t already done so.

As LLMs grow in popularity and use around the world, the need to manage and monitor their outputs becomes increasingly important. In this challenge, you will learn how to use prompt engineering techniques to generate desired results from LLMs.
Model deployment for the challenge:
gpt-4
gpt-35-turbo
NOTE: For the model families currently available, please refer to this link for more information: Azure OpenAI Service models.
Some models are not available for new deployments beginning July 6, 2023. Deployments created prior to July 6, 2023 remain available to customers until July 5, 2024. You may revise the environment file and the model you deploy accordingly. Please refer to the following link for more details: Azure OpenAI Service legacy models
These model deployments are referenced in the .env file. Please feel free to make any modifications as needed, and then rename the .env.sample file to .env.
Questions you should be able to answer by the end of this challenge:
You will run the following Jupyter notebook to complete the tasks for this challenge:
CH-01-PromptEngineering.ipynb
The file can be found in your Codespace under the /notebooks folder.
If you are working locally or in the Cloud, you can find it in the /notebooks folder of the Resources.zip file.
To run a Jupyter notebook, navigate to it in your Codespace or open it in VS Code on your local workstation. The notebook contains further instructions, as well as in-line code blocks that you will interact with to complete the tasks for this challenge. Return here to the student guide after completing all tasks in the notebook to validate that you have met the success criteria below.
Sections in this Challenge:
Iterative Prompting Principles (see the prompt sketch after this list):
3.1 Write clear and specific instructions
3.2 Give the model time to “think”
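As a preview of these two principles, here is a minimal, hypothetical prompt sketch (not taken from the notebook) that applies both: the input text is fenced off with delimiters and the expected output format is spelled out (3.1), and the model is asked to work through intermediate steps before giving a final classification (3.2). It reuses the same assumed .env variable names as the earlier sketch.

```python
# Hypothetical example of the two iterative prompting principles; the
# prompt and review text are made up for illustration.
import os

from dotenv import load_dotenv
from openai import AzureOpenAI

load_dotenv()
client = AzureOpenAI(
    azure_endpoint=os.getenv("AZURE_OPENAI_ENDPOINT"),
    api_key=os.getenv("AZURE_OPENAI_API_KEY"),
    api_version="2024-02-01",
)

review = "The headphones arrived quickly, but the left ear cup crackles at high volume."

# 3.1 Clear and specific instructions: delimit the input and state the
#     exact output format you expect.
# 3.2 Give the model time to "think": ask for intermediate steps before
#     the final answer instead of demanding the conclusion immediately.
prompt = f"""
Your task is to analyze the product review delimited by triple backticks.

Follow these steps:
Step 1 - Summarize the review in one sentence.
Step 2 - List any product issues mentioned.
Step 3 - Classify the overall sentiment as Positive, Negative, or Mixed.

Label each step on its own line in your answer.

Review: ```{review}```
"""

response = client.chat.completions.create(
    model=os.getenv("CHAT_MODEL_DEPLOYMENT_NAME"),
    messages=[{"role": "user", "content": prompt}],
    temperature=0,  # deterministic output makes it easier to iterate on the prompt
)
print(response.choices[0].message.content)
```

The notebook explores these ideas in more depth; the point here is simply that small changes in how instructions are structured can noticeably change the model’s output.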
To complete this challenge successfully: