Ollama
Open Source
Free
Run large language models locally with a single command
Ollama makes it easy to run open-source LLMs locally on your machine. It handles model downloading, quantization, and serving, and exposes an OpenAI-compatible API. It supports Llama, Mistral, Gemma, Phi, and many other model families, and is popular for local development, testing, and offline AI applications.
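Since the listing highlights an OpenAI-compatible API, here is a minimal sketch of what a chat request to a locally running Ollama server might look like. The endpoint path and model name are assumptions based on Ollama's documented defaults (the server normally listens on `http://localhost:11434`); adjust them to match your setup.

```python
import json
from urllib import request

# Build an OpenAI-style chat-completions payload. The model name
# ("llama3") is an assumption -- use whatever model `ollama pull` fetched.
def build_chat_request(model: str, prompt: str) -> dict:
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
    }

payload = build_chat_request("llama3", "Say hello in one sentence.")
print(json.dumps(payload, indent=2))

# Uncomment to POST to a locally running Ollama server
# (default OpenAI-compatible endpoint, an assumption):
# req = request.Request(
#     "http://localhost:11434/v1/chat/completions",
#     data=json.dumps(payload).encode("utf-8"),
#     headers={"Content-Type": "application/json"},
# )
# with request.urlopen(req) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```

From the command line, the "single command" flow mentioned above is simply `ollama run llama3`, which downloads the model if needed and opens an interactive session.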
Pricing: Free
Hosting
Cloud + Self-hosted
Pricing
Freemium, from Free (open-source)
HQ
🇺🇸 United States
Founded
2023
Ollama Alternatives
Explore 61 products in the Inference APIs category.
EUrouter
European AI gateway that routes to 100+ models with EU data residency
From Free (1K req/mo), 39 EUR/mo (Plus)
AKI.IO
European AI API for open-source models on EU infrastructure
Free Trial
From Free (10 EUR credits)
Jina AI
Search APIs for embeddings, reranking, and web-to-markdown conversion
Free Trial
From Free (10M tokens)