Quickstart, local deployment
This guide assumes you have prior knowledge of web services, Docker, and OpenAI settings. In this quickstart tutorial, we will set up the service and demonstrate how to use the Memory API from Python, .NET, Java, and the Bash command line.
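As a preview, here is a minimal Bash sketch of what interacting with the service looks like once it is running. It assumes the service listens on localhost port 9001 and exposes the /upload and /ask endpoints covered later in this guide; adjust the port, endpoint names, and document ID to match your configuration.

```bash
# Upload a document for ingestion (documentId is an example value)
curl -F 'file1=@README.md' -F 'documentId=doc001' http://127.0.0.1:9001/upload

# Ask a question grounded on the ingested documents
curl -X POST http://127.0.0.1:9001/ask \
  -H 'Content-Type: application/json' \
  -d '{"question": "What is Kernel Memory?"}'
```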
Requirements
- .NET 6 or higher
- Either an OpenAI API key or an Azure OpenAI deployment. If you are familiar with Ollama, you can also use a local model such as Microsoft Phi-3 or Meta Llama, though inference may be slower depending on your device.
- A vector database, such as Azure AI Search, Qdrant, or Postgres+pgvector. For basic tests, you can use KM SimpleVectorDb.
- A copy of the KM repository (see the sketch after this list).
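To satisfy the last two requirements, you can clone the repository and verify your .NET SDK version from the command line; a minimal sketch, assuming the public GitHub repository URL:

```bash
# Fetch a copy of the KM repository
git clone https://github.com/microsoft/kernel-memory.git
cd kernel-memory

# Confirm the installed .NET SDK version (6 or higher is required)
dotnet --version
```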
Next: Create a configuration file
Other examples
The repository contains more documentation and examples; here are some suggestions: