
Open WebUI

Open WebUI is an extensible, feature-rich, and user-friendly self-hosted AI platform designed to operate entirely offline. It supports various LLM runners such as Ollama and OpenAI-compatible APIs, with a built-in inference engine for RAG, making it a powerful AI deployment solution.

Make directory

Terminal window
mkdir -p {{{DOCKER_PATH_VAR}}}/open-webui && cd {{{DOCKER_PATH_VAR}}}/open-webui

docker-compose.yml

Terminal window
nano docker-compose.yml
docker-compose.yml
services:
  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    container_name: open-webui
    ports:
      - "3007:8080"
    extra_hosts:
      - host.docker.internal:host-gateway
    volumes:
      - ./data:/app/backend/data
    restart: always
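If Ollama runs directly on the Docker host (which is what the `host.docker.internal` mapping above is for), Open WebUI can be pointed at it via the `OLLAMA_BASE_URL` environment variable. A minimal sketch of the addition to the service definition, assuming Ollama listens on its default port 11434:

```yaml
services:
  open-webui:
    # ...existing settings from the compose file above...
    environment:
      # assumes Ollama is running on the host on its default port
      - OLLAMA_BASE_URL=http://host.docker.internal:11434
```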

Start container

Terminal window
docker compose up -d
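To confirm the container came up cleanly, you can check its state and follow the startup logs (the first start may take a moment while Open WebUI initializes its data directory):

```shell
# show container state for this compose project
docker compose ps

# follow the Open WebUI logs; press Ctrl+C to stop following
docker logs -f open-webui
```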

Open Web UI

Open http://localhost:3007 or http://{{{IP_ADDRESS_VAR}}}:3007 in your browser.
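Because the compose file tracks the `:main` tag, updating is a matter of pulling the newer image and recreating the container; user data survives the update because it lives in the `./data` bind mount:

```shell
cd {{{DOCKER_PATH_VAR}}}/open-webui

# fetch the latest :main image and recreate the container with it
docker compose pull
docker compose up -d

# optionally remove superseded images to reclaim disk space
docker image prune -f
```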