LocalAI
LocalAI acts as a drop-in replacement REST API that is compatible with the OpenAI API specification for local inferencing. It uses free, open-source models and runs on CPUs.
Because LocalAI acts as an OpenAI replacement, the container maps OpenAI model names to local models: for example, `gpt-4` is mapped to `phi-2`.
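As a minimal sketch of what this mapping means for a client, the request body below asks for `gpt-4` through LocalAI's OpenAI-compatible chat endpoint; the container resolves that name to `phi-2`. The message content here is an illustrative placeholder.

```python
import json

# Sketch: the body of a chat-completion request a client would POST to
# LocalAI's OpenAI-compatible endpoint (http://127.0.0.1:8080/v1/chat/completions).
payload = {
    "model": "gpt-4",  # remapped to phi-2 inside the container
    "messages": [{"role": "user", "content": "Hello from LocalAI"}],
}

print(json.dumps(payload, indent=2))
```

The client keeps using the familiar OpenAI model name; only the base URL changes.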
Install Docker. See the LocalAI documentation for more information.
Update the `.env` file and set the API type to `localai`:

```shell
# .env
OPENAI_API_TYPE=localai
```
To start LocalAI in docker, run the following command:
```shell
docker run -p 8080:8080 --name local-ai -ti localai/localai:latest-aio-cpu
docker start local-ai
docker stats
echo "LocalAI is running at http://127.0.0.1:8080"
```
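Once the container is up, one way to confirm the server is reachable is to query its OpenAI-compatible `/v1/models` endpoint. The snippet below is a sketch that assumes LocalAI is listening on `127.0.0.1:8080`; it returns an empty list rather than raising if the server is not running.

```python
import json
import urllib.error
import urllib.request

# Sketch: list the model ids LocalAI exposes through its OpenAI-compatible API.
URL = "http://127.0.0.1:8080/v1/models"

def list_models(url: str) -> list:
    """Return model ids, or an empty list if the server is unreachable."""
    try:
        with urllib.request.urlopen(url, timeout=5) as resp:
            data = json.load(resp)
        return [m["id"] for m in data.get("data", [])]
    except (urllib.error.URLError, OSError):
        return []

print(list_models(URL))
```

If the container from the previous step is running, the mapped model names (such as `gpt-4`) should appear in the output.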