LiteLLM

The LiteLLM proxy gateway provides an OpenAI-compatible API in front of the models it serves (including models run locally). Configure the LITELLM_... environment variables to set the API key and, optionally, the base URL of the gateway.

Use the litellm provider.

.env
LITELLM_API_KEY="..."
#LITELLM_API_BASE="..."
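Because the gateway speaks the OpenAI protocol, the two variables map directly onto a standard chat completion request. A minimal sketch, assuming `build_chat_request` is a hypothetical helper (not part of LiteLLM) and that `http://localhost:4000` matches LiteLLM's default local proxy port:

```python
import json
import os

def build_chat_request(model: str, prompt: str) -> tuple[str, dict, bytes]:
    """Build an OpenAI-style chat completion request for the LiteLLM gateway
    from the LITELLM_... environment variables (hypothetical helper)."""
    # Fall back to LiteLLM's default local port if no base URL is set;
    # your deployment may differ.
    base = os.environ.get("LITELLM_API_BASE", "http://localhost:4000").rstrip("/")
    url = f"{base}/v1/chat/completions"
    headers = {
        "Authorization": f"Bearer {os.environ['LITELLM_API_KEY']}",
        "Content-Type": "application/json",
    }
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    return url, headers, body
```

The returned URL, headers, and JSON body can then be sent with any HTTP client, or you can point an OpenAI SDK client at the same base URL instead.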