LocalAI

LocalAI acts as a drop-in replacement REST API compatible with the OpenAI API specification for local inferencing. It uses free, open-source models and runs on CPUs.

Because LocalAI acts as an OpenAI replacement, model names are remapped inside the container; for example, gpt-4 is mapped to phi-2.

  1. Install Docker. See the LocalAI documentation for more information.

  2. Update the .env file and set the API type to localai.

    .env
    OPENAI_API_TYPE=localai
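
Depending on the client library, you may also need to point the OpenAI base URL at the LocalAI server. A hedged sketch, assuming the integration reads the common OPENAI_API_BASE variable (check your client's documentation for the exact name):

```shell
# .env — hypothetical additional setting; only needed if your client
# does not already default to the LocalAI address
OPENAI_API_BASE=http://localhost:8080/v1
```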

To start LocalAI in Docker, run the following commands:

Terminal window
# Start the container detached; --name lets later commands reference it
docker run -d -p 8080:8080 --name local-ai localai/localai:latest-aio-cpu
# Optional: check the container's resource usage
docker stats --no-stream local-ai
echo "LocalAI is running at http://127.0.0.1:8080"
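
Once the container is up, the OpenAI-compatible endpoint can be exercised with a plain curl request. A minimal sketch, assuming the default port mapping above; per the mapping mentioned earlier, the model name gpt-4 is served by phi-2 inside the AIO image:

```shell
# Send a chat completion request to the local OpenAI-compatible API
curl http://127.0.0.1:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
        "model": "gpt-4",
        "messages": [{"role": "user", "content": "Hello"}]
      }'
```

Note that the first request may be slow while the model loads into memory.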