LiteLLM
The LiteLLM proxy gateway provides an OpenAI-compatible API for running models locally.
Configure the LITELLM_... environment variables to set the API key and, optionally, the base URL.
Use the litellm provider.
LITELLM_API_KEY="..."#LITELLM_API_BASE="..."The LiteLLM proxy gateway provides a OpenAI compatible API for running models locally.