
Open WebUI

Self-hostable web UI for LLMs from the Open WebUI project. Multi-user chat with generic OIDC authentication, shared workspaces, document RAG, MCP client support, and Python-based Pipelines and Functions for custom extensions.

Screenshot of Open WebUI website

Open WebUI is a self-hostable web interface for large language models, originally built as a frontend for Ollama and now a full multi-user chat platform. It runs as a Docker container and supports direct connections to OpenAI, Anthropic, Ollama, and any OpenAI-compatible endpoint, with authentication, workspaces, RAG, and MCP tool integration built in.

What it does

Users sign in through a web UI, select a model from the providers the admin has configured, and chat. Conversations are saved per-user with tagging and search. Documents uploaded to a workspace are chunked, embedded, and made available to the LLM through RAG. Models can be grouped into shared catalogs with per-role access controls.
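The chunk-embed-retrieve flow described above is the standard RAG technique; the following is a conceptual sketch of that flow, not Open WebUI's implementation, using a toy bag-of-words "embedding" where Open WebUI would call a real embeddings model.

```python
import math
from collections import Counter

def chunk(text: str, size: int = 40) -> list[str]:
    """Split a document into fixed-size character chunks.
    Real systems use token- or sentence-aware splitting."""
    return [text[i:i + size] for i in range(0, len(text), size)]

def embed(text: str) -> Counter:
    """Toy bag-of-words 'embedding'; a real deployment calls an embeddings model."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse word-count vectors."""
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, chunks: list[str], k: int = 2) -> list[str]:
    """Rank stored chunks by similarity to the query and return the
    top k, which would then be injected into the LLM prompt."""
    q = embed(query)
    return sorted(chunks, key=lambda c: cosine(q, embed(c)), reverse=True)[:k]
```

The retrieved chunks are what the model actually sees alongside the user's question; the quality of the split and the embedding model dominate RAG quality in practice.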

Authentication covers email-password, OAuth (GitHub, Google, Microsoft), and generic OIDC for providers like Keycloak, Zitadel, or Authentik. Workspaces let teams share prompts, models, and tools across groups of users. The MCP client lets a conversation invoke tools from registered MCP servers inline.
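Generic OIDC is driven by environment variables on the container. The sketch below shows the shape of such a configuration against a Keycloak realm; the variable names reflect Open WebUI's documented OAuth settings, but treat them (and the example URLs) as assumptions to verify against the current docs for your version.

```shell
# Hypothetical OIDC setup against a Keycloak realm -- verify variable
# names and values against the Open WebUI docs for your release.
export ENABLE_OAUTH_SIGNUP=true
export OAUTH_CLIENT_ID="open-webui"
export OAUTH_CLIENT_SECRET="change-me"
# Discovery endpoint of the identity provider (example URL):
export OPENID_PROVIDER_URL="https://sso.example.com/realms/main/.well-known/openid-configuration"
export OAUTH_PROVIDER_NAME="Keycloak"   # label shown on the login button
```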

Extension model

Two extension mechanisms ship with the product:

  • Pipelines — a separate service that intercepts requests and responses, letting operators add custom preprocessing, logging, filtering, or routing logic in Python.
  • Functions — lightweight custom model-like endpoints defined in Python directly in the admin UI, useful for exposing internal APIs or transformations as selectable "models" in the chat interface.

A plugin marketplace and tool marketplace provide community-built extensions.

Deployment

Single Docker container backed by SQLite for small installs or Postgres for multi-user production. Runs on Docker, Docker Swarm, and Kubernetes. Optional Redis for rate limiting and background jobs. RAG uses ChromaDB by default with options to swap in Qdrant, Milvus, or pgvector.
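A small SQLite-backed install reduces to one `docker run`, along the lines of the project's quick-start; verify the image tag and data path against the current documentation before relying on them.

```shell
# Minimal single-container install: SQLite lives inside the mounted
# volume, so the named volume is the only state to back up.
docker run -d \
  --name open-webui \
  -p 3000:8080 \
  -v open-webui:/app/backend/data \
  ghcr.io/open-webui/open-webui:main
```

Moving to Postgres, Redis, or an external vector database later is a matter of pointing the container at those services via environment variables rather than changing the image.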

Licensing

BSD-3-Clause. Open WebUI Enterprise is a paid support and consulting offering from the project; feature development happens in the open-source repository.

Limitations

  • Provider, model, and pipeline configuration is stored in the database and managed through the admin UI or API, not in version-controlled config files.
  • The Pipelines extension point runs as a separate service from the main Open WebUI container — teams using Pipelines operate two services instead of one.
  • Streaming responses require long-lived WebSocket connections; reverse proxies in front of Open WebUI need to be configured for WebSocket timeouts and connection upgrades.
  • Major version upgrades occasionally require database migrations; release notes document the steps.
  • RAG depends on an embeddings model that must be configured separately (local via Ollama, or an API-based embedding provider).
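The WebSocket point above typically translates into explicit upgrade and timeout settings on the reverse proxy. An nginx sketch, with the upstream name and location as placeholders:

```nginx
location / {
    proxy_pass http://open-webui:8080;        # upstream container name is a placeholder
    proxy_http_version 1.1;
    proxy_set_header Upgrade $http_upgrade;   # allow WebSocket connection upgrades
    proxy_set_header Connection "upgrade";
    proxy_set_header Host $host;
    proxy_read_timeout 300s;                  # keep long-lived streaming responses alive
}
```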

Kind: Software
Vendor: Open WebUI
License: Open Source
Website: openwebui.com