# Phi-3 Mini with Ollama
Phi-3 Mini is a 3.8B-parameter, lightweight, state-of-the-art open model from Microsoft. In this guide, we use Ollama, a desktop application that lets you download and run models locally.
Start the Ollama application, or launch the server by running the following command in a terminal:
```sh
ollama serve
```

(optional) Pull your model from the Ollama server (see the list of models). GenAIScript will automatically attempt to pull it if it is missing.
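To confirm the server is up before continuing, you can query its model-listing endpoint (`/api/tags`). The port below is Ollama's default, 11434; adjust it if you have changed your configuration.

```shell
# Ask the local Ollama server for its installed models.
# A JSON response confirms the server is running.
curl http://localhost:11434/api/tags
```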
```sh
ollama pull phi3
```

Update your script to use the `ollama:phi3` model.

```js title="summarize-phi3.genai.mjs"
script({
    model: "ollama:phi3",
    title: "summarize with phi3",
    system: ["system"],
})
const file = def("FILE", env.files)
$`Summarize ${file} in a single paragraph.`
```

Apply this script to the files you want to summarize!
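For instance, you can run the script from a terminal with the GenAIScript CLI, passing the files to summarize as arguments. The file glob below (`src/*.md`) is just an illustrative placeholder; substitute your own paths.

```shell
# Run the summarize-phi3 script against the chosen files.
# Requires the Ollama server to be running with phi3 pulled.
npx genaiscript run summarize-phi3 src/*.md
```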