
Ollama

Open Source

Run large language models locally with a single command

Ollama makes it easy to run open-source LLMs locally on your machine. It handles model downloading, quantization, and serving with an OpenAI-compatible API. Supports Llama, Mistral, Gemma, Phi, and many other model families. Popular for local development, testing, and offline AI applications.
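A minimal sketch of the workflow described above, assuming Ollama is already installed and running locally; `llama3` is used as an example model name, and any model from the Ollama library works the same way:

```shell
# Download a model (Ollama fetches a pre-quantized build automatically)
ollama pull llama3

# Run it interactively, or pass a one-shot prompt
ollama run llama3 "Explain quantization in one sentence."

# Ollama also serves an OpenAI-compatible HTTP API on localhost:11434,
# so existing OpenAI client code can point at it with only a base-URL change
curl http://localhost:11434/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "llama3",
    "messages": [{"role": "user", "content": "Hello"}]
  }'
```

Because the API mirrors OpenAI's chat-completions format, tools and SDKs built for OpenAI can typically be redirected to the local endpoint without code changes beyond the base URL and model name.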

Pricing: Free (open-source)
Hosting: Cloud + Self-hosted
HQ: 🇺🇸 United States
Founded: 2023
