LiteLLM
Unified OpenAI-compatible proxy for 100+ LLM providers with cost tracking and load balancing
LiteLLM is a Python SDK and proxy server that provides a single OpenAI-compatible interface for calling over 100 LLM APIs, including OpenAI, Anthropic, Azure, Bedrock, Vertex AI, Cohere, and Ollama. It handles cost tracking, budget management, virtual API keys, guardrails, and load balancing across deployments, and the proxy can handle 1,500+ requests per second. The core SDK and proxy are open source under the MIT license, with enterprise features available for teams needing SSO and advanced auth.
Pricing: Free / enterprise
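Because every provider behind the proxy is addressed through the same OpenAI-style request shape, switching models is just a string change. A minimal stdlib-only sketch, assuming a proxy running on its default port (4000) and a hypothetical virtual key:

```python
import json
import urllib.request

# The proxy exposes an OpenAI-compatible endpoint; only the model name
# changes between providers (e.g. "gpt-4o", "claude-3-5-sonnet",
# "ollama/llama3"). Port 4000 is the proxy's default.
PROXY_URL = "http://localhost:4000/v1/chat/completions"


def build_request(model: str, prompt: str,
                  api_key: str = "sk-1234") -> urllib.request.Request:
    """Build an OpenAI-style chat completion request for the LiteLLM proxy.

    `api_key` stands in for a virtual key issued by the proxy (hypothetical
    value here) so spend can be tracked per key.
    """
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        PROXY_URL,
        data=json.dumps(payload).encode(),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
    )


if __name__ == "__main__":
    req = build_request("claude-3-5-sonnet", "Hello!")
    # Requires a running proxy; uncomment to send the request:
    # with urllib.request.urlopen(req) as resp:
    #     print(json.load(resp)["choices"][0]["message"]["content"])
```

The same request body also works with the OpenAI Python client pointed at the proxy's base URL, which is how most existing OpenAI-based code is migrated.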
LiteLLM Alternatives
Explore 21 products in the Frameworks & Stacks category.
Mastra
TypeScript-first AI framework for building agents, RAG pipelines, and workflows
Ollama
Run large language models locally with a single command
Hugging Face
Open AI platform with 500K+ models, inference endpoints, and fine-tuning tools