LLaMA.cpp
LLaMA.cpp also allows running models locally or interfacing with other LLM vendors.
Update the .env file with the local server information:

OPENAI_API_BASE=http://localhost:...
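Because the LLaMA.cpp server exposes an OpenAI-compatible HTTP API, any client that reads OPENAI_API_BASE can be redirected to it. The sketch below, with a placeholder host, port, and model name (match these to your own server), shows how a request URL and chat payload would be assembled against that base:

```python
import json
import os

# Point OpenAI-compatible clients at the local LLaMA.cpp server.
# Host and port are placeholders; use the values from your .env file.
os.environ.setdefault("OPENAI_API_BASE", "http://localhost:8080/v1")

base = os.environ["OPENAI_API_BASE"]
url = f"{base}/chat/completions"

# A chat-completion request in the OpenAI format; many local servers
# ignore or remap the "model" field, so this name is illustrative.
payload = {
    "model": "local-model",
    "messages": [{"role": "user", "content": "Hello"}],
}

print(url)
print(json.dumps(payload))
```

No network call is made here; a real client would POST this JSON to the URL shown.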