Willow Inference Server
Open Source · Apache-2.0
Difficulty: ●● Medium
Last check: 2026-02-12
Description
Open-source, local, self-hosted, and highly optimized language inference server supporting ASR/STT, TTS, and LLM over WebRTC, REST, and WebSocket
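The REST transport mentioned above can be exercised with any plain HTTP client. The sketch below assembles a speech-to-text request; the endpoint path (`/api/asr`), port, and parameter names are illustrative assumptions, not taken from the official API documentation, so adjust them to match your deployment.

```python
# Hypothetical sketch of a REST ASR request against a self-hosted
# inference server. Endpoint path, port, and parameter names are
# assumptions for illustration; consult the official docs.
from urllib.parse import urlencode


def build_asr_request(base_url: str, audio_path: str, language: str = "en") -> dict:
    """Assemble the URL and file payload for a speech-to-text request."""
    query = urlencode({"language": language})
    return {
        "url": f"{base_url}/api/asr?{query}",
        "files": {"audio": audio_path},
    }


req = build_asr_request("https://localhost:19000", "speech.wav")
print(req["url"])
# A real call would then be, e.g.:
#   requests.post(req["url"], files={"audio": open("speech.wav", "rb")})
```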
Platforms
Linux Docker
Replaces the following proprietary tools
ChatGPT API → Willow Inference Server
Self-Hosting
Willow Inference Server can be self-hosted on your own infrastructure. Visit the official documentation for installation instructions.
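As a rough sketch of the self-hosting procedure on a Docker-capable Linux host: the repository URL and helper-script names below are assumptions based on the upstream project layout, so follow the official documentation for the supported steps.

```shell
# Hypothetical setup sketch; script names and repo URL are assumptions,
# not a substitute for the official installation instructions.
git clone https://github.com/toverainc/willow-inference-server.git
cd willow-inference-server
./utils.sh install   # build the Docker image and fetch models
./utils.sh run       # start the inference server
```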
Documentation →
Tags
#cuda
#deep-learning
#llama
#llm
#privacy
#speech-recognition
#speech-to-text
Similar Tools
Ollama
Run large language models locally with ease
105.0k
↳ ChatGPT, Midjourney, +1
MIT ● Easy
LocalAI
Local OpenAI-compatible API for various AI models
26.0k
↳ ChatGPT, Midjourney, +1
MIT ●● Medium
Stable Diffusion WebUI
Powerful user interface for AI image generation
145.0k
↳ ChatGPT, Midjourney, +1
AGPL-3.0 ●● Medium