Ollama
Open Source · MIT License · Run large language models locally with ease
Description
Ollama allows you to run large language models such as Llama, Mistral, Gemma, and many more locally on your own machine. Installation is remarkably simple, and models can be downloaded and started with a single command. Ollama provides a REST API and is compatible with the OpenAI API, making integration into existing applications straightforward.
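The REST API mentioned above can be exercised with a short script. This sketch uses only the Python standard library and Ollama's documented default endpoint and port (`http://localhost:11434`, `POST /api/generate`); the model name `llama3` is an example and must be pulled beforehand.

```python
import json
from urllib import request

OLLAMA_URL = "http://localhost:11434"  # Ollama's default listen address


def build_payload(model: str, prompt: str) -> dict:
    # Minimal non-streaming request body for the /api/generate endpoint.
    return {"model": model, "prompt": prompt, "stream": False}


def generate(model: str, prompt: str) -> str:
    # POST the JSON payload and return the generated text from the response.
    data = json.dumps(build_payload(model, prompt)).encode()
    req = request.Request(
        f"{OLLAMA_URL}/api/generate",
        data=data,
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]


if __name__ == "__main__":
    # Requires a running Ollama server with the model pulled (`ollama pull llama3`).
    print(generate("llama3", "Why is the sky blue?"))
```

Because the API is also OpenAI-compatible, existing OpenAI client libraries can instead be pointed at `http://localhost:11434/v1` without code changes beyond the base URL.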
Platforms
Linux Windows macOS Docker
Replaces the following proprietary tools
ChatGPT → Ollama
Self-Hosting
Ollama can be self-hosted on your own infrastructure. Visit the official documentation for installation instructions.
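On Linux, installation and a first model run follow the official one-line install script; `llama3` is an example model name:

```shell
# Install Ollama (Linux; official install script from ollama.com)
curl -fsSL https://ollama.com/install.sh | sh

# Download a model (if needed) and start an interactive chat session
ollama run llama3

# The bundled server then listens on http://localhost:11434 for REST API calls
```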
Documentation →
Categories
Tags
#llm
#local
#language-models
#rest-api
#go
Similar Tools
LocalAI
Local OpenAI-compatible API for various AI models
MIT ●● Medium Self-hostable
ChatGPT Midjourney DALL-E
Last: Feb 2026 26.0k
View details →
Stable Diffusion WebUI
Powerful user interface for AI image generation
AGPL-3.0 ●● Medium Self-hostable
Midjourney DALL-E
Last: Feb 2026 145.0k
View details →
Text Generation WebUI
Versatile interface for local language models
AGPL-3.0 ●● Medium Self-hostable
ChatGPT
Last: Feb 2026 42.0k
View details →