
LiteLLM

Unified OpenAI-compatible proxy for 100+ LLM providers with cost tracking and load balancing

Open Source · Free Trial

LiteLLM is a Python SDK and proxy server that provides a single OpenAI-compatible interface to call over 100 LLM APIs, including OpenAI, Anthropic, Azure, Bedrock, Vertex AI, Cohere, and Ollama. It handles cost tracking, budget management, virtual API keys, guardrails, and load balancing across deployments. The proxy can handle 1,500+ requests per second. The core SDK and proxy are open source under MIT, with enterprise features available for teams needing SSO and advanced auth.
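The key idea is that every provider is called through the same OpenAI-style chat-completions schema; only the `model` string changes. A minimal sketch of that request shape (the model names here are illustrative, and the helper function is hypothetical, not part of the LiteLLM API):

```python
import json

def build_chat_request(model: str, prompt: str) -> dict:
    """Build an OpenAI-compatible chat completion payload.

    The LiteLLM proxy and SDK accept this same schema for every
    backend; switching providers is just a different `model` string,
    e.g. "gpt-4o" vs "anthropic/claude-3-haiku-20240307".
    """
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

# Same payload shape, two different providers:
openai_req = build_chat_request("gpt-4o", "Hello")
anthropic_req = build_chat_request("anthropic/claude-3-haiku-20240307", "Hello")
print(json.dumps(openai_req))
```

In practice you would send this payload to the proxy's `/chat/completions` endpoint or pass the same arguments to the SDK's `completion()` function; the point is that application code stays identical across providers.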

Pricing: Free / enterprise

