SambaNova
Custom AI chip inference platform with purpose-built hardware for high-throughput LLM serving
Free Trial
SambaNova builds custom AI inference chips (SN40L and SN50) and offers SambaNova Cloud, an inference API for running large language models. The company claims its purpose-built chips deliver significantly higher throughput than NVIDIA GPUs, including a 5.7x speedup over the H200 on DeepSeek R1. The cloud API is OpenAI-compatible and supports Llama 3.1 (8B, 70B, and 405B) and DeepSeek R1. The free tier includes $5 in credits; developer and enterprise tiers offer pay-as-you-go, token-based pricing.
Pricing: Per-token usage
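Because the API is OpenAI-compatible, requests follow the standard chat-completions wire format. Below is a minimal sketch of building such a request; the endpoint URL, model id, and API key are illustrative assumptions, not values confirmed by this listing.

```python
import json
import urllib.request

API_URL = "https://api.sambanova.ai/v1/chat/completions"  # assumed endpoint
API_KEY = "YOUR_API_KEY"  # placeholder credential

def build_request(prompt: str, model: str = "Meta-Llama-3.1-8B-Instruct"):
    """Construct a POST request in the OpenAI chat-completions format."""
    payload = {
        "model": model,  # assumed model id on SambaNova Cloud
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        API_URL,
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
    )

req = build_request("Summarize self-attention in one sentence.")
# urllib.request.urlopen(req) would send the request; omitted here.
```

Because the wire format matches OpenAI's, existing OpenAI client libraries can usually be pointed at the service by overriding their base URL rather than hand-building requests like this.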
SambaNova Alternatives
Explore 31 products in the Inference APIs category.