
Ollama

Run large language models locally with a single command

Open Source · Free

Ollama makes it easy to run open-source LLMs locally on your machine. It handles model downloading, quantization, and serving with an OpenAI-compatible API. Supports Llama, Mistral, Gemma, Phi, and many other model families. Popular for local development, testing, and offline AI applications.
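Because Ollama exposes an OpenAI-compatible API, any OpenAI-style client can talk to it. A minimal sketch, assuming a local Ollama server on its default port (11434) and a previously pulled model; the model name `llama3.2` and the helper names here are illustrative placeholders:

```python
import json
import urllib.request

# Ollama's OpenAI-compatible chat endpoint on the default local port.
OLLAMA_URL = "http://localhost:11434/v1/chat/completions"

def build_chat_request(model: str, prompt: str) -> dict:
    """Build an OpenAI-style chat-completion payload for Ollama."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,  # ask for one complete response instead of chunks
    }

def chat(model: str, prompt: str) -> str:
    """Send the request to a locally running Ollama server, return the reply text."""
    payload = json.dumps(build_chat_request(model, prompt)).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    # OpenAI-compatible responses put the text under choices[0].message.content.
    return body["choices"][0]["message"]["content"]
```

After `ollama pull llama3.2`, calling `chat("llama3.2", "Why is the sky blue?")` returns the model's answer; swapping the model string is all it takes to target any other pulled model.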

Pricing: Free

