Willow Inference Server

Open Source Apache-2.0

Difficulty
●● Medium
Last check
2026-02-12

Description

Open-source, local, and self-hosted highly optimized language inference server supporting ASR/STT, TTS, and LLMs over WebRTC, REST, and WebSockets.

Preview

[Image: Willow Inference Server preview]

Platforms

Linux, Docker

Replaces the following proprietary tools

ChatGPT API
Midjourney
DALL-E

Self-Hosting

Willow Inference Server can be self-hosted on your own infrastructure. Visit the official documentation for installation instructions.

Documentation →
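As a rough illustration of a self-hosted deployment, a Docker Compose file might look like the sketch below. The image tag, port, and volume path are assumptions for illustration only, not the project's published values; the official documentation provides the actual installation procedure and setup scripts.

```yaml
# Hypothetical docker-compose.yml sketch -- illustration only.
# The image name, port, and cache path are assumed, not taken from
# the project's documentation; consult the official docs before use.
services:
  willow-inference-server:
    image: willow-inference-server:latest   # assumed image tag
    restart: unless-stopped
    ports:
      - "19000:19000"                       # assumed API port
    volumes:
      - ./cache:/app/cache                  # assumed path for cached models
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia                # GPU passthrough for CUDA
              count: all
              capabilities: [gpu]
```

The `deploy.resources.reservations.devices` stanza is the standard Compose way to expose NVIDIA GPUs to a container, which matters here because the server is CUDA-accelerated (see the #cuda tag below).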

Tags

#cuda #deep-learning #llama #llm #privacy #speech-recognition #speech-to-text