LLaMA.cpp

LLaMA.cpp also allows running models locally or interfacing with other LLM vendors.

  1. Update the .env file with the local server information.

    .env
    OPENAI_API_BASE=http://localhost:...
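
Once `OPENAI_API_BASE` points at the local server, any OpenAI-compatible client will send its requests there instead of to the hosted API. The sketch below shows how that base URL is typically combined with an endpoint path; the port `8080` and the `/v1` prefix are assumptions for illustration, not values from this document — use whatever your local server actually reports.

```python
import os

# Read the base URL configured in .env; the default shown here
# (port 8080, /v1 prefix) is an assumption for illustration only.
base = os.environ.get("OPENAI_API_BASE", "http://localhost:8080/v1")

# An OpenAI-compatible client would append endpoint paths such as
# /chat/completions to this base URL when making requests.
url = base.rstrip("/") + "/chat/completions"
print(url)
```

This is only a sketch of how the configured base URL is used; in practice the client library reads `OPENAI_API_BASE` for you and builds these URLs internally.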