Open WebUI
Open WebUI is an open-source, self-hosted, and extensible AI chat interface designed to run completely offline. It provides a user-friendly front end for interacting with local LLM runtimes (such as Ollama) or remote OpenAI-compatible APIs. Open WebUI includes advanced features such as RAG (Retrieval-Augmented Generation), multi-user support with access controls, custom pipelines and workflows, and secure, private deployments.
Offline-First
Works fully offline with local LLM runtimes such as Ollama or LM Studio.
OpenAI-Compatible API Support
Connect to any backend that exposes an OpenAI-compatible API.
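As a rough illustration, the sketch below sends a single chat request to an OpenAI-compatible endpoint using plain Python. The base URL, model name, and API key are placeholders for whatever backend you actually run; they are not values prescribed by Open WebUI.

```python
# Minimal sketch of calling an OpenAI-compatible chat completions endpoint.
# BASE_URL, API_KEY, and the model name are placeholders -- point them at
# whichever backend you have running (local or hosted).
import requests

BASE_URL = "http://localhost:11434/v1"   # assumed local OpenAI-compatible endpoint
API_KEY = "not-needed-for-local"         # many local backends ignore the key

def chat(prompt: str, model: str = "llama3") -> str:
    """Send a single-turn chat request and return the assistant's reply."""
    response = requests.post(
        f"{BASE_URL}/chat/completions",
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={
            "model": model,
            "messages": [{"role": "user", "content": prompt}],
        },
        timeout=60,
    )
    response.raise_for_status()
    return response.json()["choices"][0]["message"]["content"]

if __name__ == "__main__":
    print(chat("Explain what an OpenAI-compatible API is in one sentence."))
```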
RAG (Retrieval-Augmented Generation)
Enhance responses using custom document sources or vector stores.
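The underlying RAG pattern is straightforward to sketch: embed the documents, retrieve the chunks most similar to the query, and prepend them to the prompt as context. The toy example below uses a hash-based bag-of-words embedding purely for illustration; it is not Open WebUI's retrieval code, and a real deployment would use a proper embedding model and vector store.

```python
# Toy RAG sketch: embed documents, retrieve the most similar ones for a query,
# and build a context-augmented prompt. embed() is a stand-in for a real
# embedding model, NOT an Open WebUI interface.
import math

def embed(text: str, dim: int = 256) -> list[float]:
    """Toy bag-of-words hash embedding; swap in a real embedding model."""
    vec = [0.0] * dim
    for token in text.lower().split():
        vec[hash(token) % dim] += 1.0
    return vec

def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def retrieve(query: str, docs: list[str], k: int = 3) -> list[str]:
    """Return the k documents most similar to the query."""
    q = embed(query)
    return sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)[:k]

def build_prompt(query: str, docs: list[str]) -> str:
    """Prepend retrieved context so the model answers from the documents."""
    context = "\n\n".join(retrieve(query, docs))
    return f"Answer using only this context:\n\n{context}\n\nQuestion: {query}"
```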
Multi-User with Role-Based Access
Manage users and groups with role-based access controls.
Chat Management
Organize, rename, delete, and export chat sessions.
Markdown, LaTeX, and Code Highlighting
Supports rich formatting for chat content and responses.
File Uploads & Embedding
Upload documents for context-aware LLM interactions.
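Before embedding, uploaded files are typically split into overlapping chunks so each piece fits the embedding model's input. The sketch below shows that generic preprocessing step; the chunk size and overlap are arbitrary example values, not Open WebUI settings.

```python
# Generic upload preprocessing sketch: split a document into overlapping
# character-based chunks before embedding. Values are illustrative only.

def chunk_text(text: str, chunk_size: int = 500, overlap: int = 50) -> list[str]:
    """Split text into overlapping chunks for embedding."""
    if chunk_size <= overlap:
        raise ValueError("chunk_size must be larger than overlap")
    chunks = []
    start = 0
    while start < len(text):
        chunks.append(text[start : start + chunk_size])
        start += chunk_size - overlap
    return chunks

# Each chunk would then be embedded (see the RAG sketch above) and indexed,
# keyed back to the source file, so chats can draw on the uploaded document.
```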
PWA Support
Installable as a Progressive Web App for mobile and desktop use.
Webhooks and Pipelines
Automate tasks and extend chat workflows with custom logic.
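To give a flavor of the webhook side, the sketch below forwards a chat event to an external service. The endpoint URL and payload fields are hypothetical examples, not a documented Open WebUI schema; custom logic beyond simple notifications is the kind of thing pipelines are meant to host.

```python
# Illustrative webhook sketch: POST a chat event to an external service.
# WEBHOOK_URL and the payload fields are hypothetical, not an Open WebUI schema.
import requests

WEBHOOK_URL = "https://example.com/hooks/chat-events"  # hypothetical endpoint

def notify_webhook(event: str, chat_id: str, message: str) -> int:
    """POST a JSON event to the webhook and return the HTTP status code."""
    response = requests.post(
        WEBHOOK_URL,
        json={"event": event, "chat_id": chat_id, "message": message},
        timeout=10,
    )
    return response.status_code

# Example: notify_webhook("chat.completed", "abc123", "Model reply text")
```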