About

An Ollama server running inside Android: fully offline AI inference with a Go backend, WebView UI, and JNI bridge. No internet, no tracking, full privacy.

Version: v1.0.0

Size: 12.6 MB

Stars: 6

Forks: 0

Language: HTML

Open Issues: 0

License: MIT
