Open WebUI
Open WebUI (formerly Ollama WebUI) is a self-hosted, extensible web interface for interacting with LLMs. It supports OpenAI-compatible endpoints, making it straightforward to route all LLM traffic through the Keeptrusts gateway. This gives you policy enforcement, PII redaction, content filtering, audit logging, and cost attribution for every conversation in your Open WebUI deployment.
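Because the gateway speaks the OpenAI wire format, any OpenAI-compatible client can target it by overriding the base URL. As a minimal sketch (the local gateway address and the model name are illustrative), the request body Open WebUI sends is a standard chat-completion payload:

```python
import json

# Illustrative gateway address; Open WebUI is pointed here via the
# OPENAI_API_BASE_URL environment variable.
GATEWAY_BASE_URL = "http://localhost:41002/v1"

def build_chat_request(user_message: str, model: str = "gpt-4o") -> dict:
    """Build a standard OpenAI-style chat-completion payload."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": user_message}],
    }

payload = build_chat_request("Hello from Open WebUI gateway test.")
print(json.dumps(payload))
```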
Use this page when
- You are deploying Open WebUI and need governance over all LLM interactions.
- You want to route Open WebUI's LLM traffic through the Keeptrusts gateway.
- You need Ollama-specific local model configuration (see your Ollama documentation alongside this guide).
Primary audience
- Primary: Technical Engineers
- Secondary: AI Agents, Technical Leaders
Prerequisites
- Docker and Docker Compose installed
- Keeptrusts CLI (`kt`) installed and on your `PATH`
- An LLM provider API key (OpenAI, Anthropic, or compatible) or a local Ollama instance
Configuration
Gateway policy config
```yaml
pack:
  name: open-webui-gateway
  version: 1.0.0
  enabled: true

providers:
  targets:
    - id: openai-webui
      provider: openai:chat:gpt-4o
      secret_key_ref:
        env: OPENAI_API_KEY

policies:
  chain:
    - prompt-injection
    - pii-detector
    - content-filter
    - audit-logger
  policy:
    prompt-injection:
      threshold: 0.8
      action: block
    pii-detector:
      action: redact
      entities:
        - EMAIL
        - PHONE
        - SSN
        - CREDIT_CARD
    content-filter:
      action: block
      categories:
        - restricted-topics
    audit-logger:
      immutable: true
      retention_days: 365
      log_all_access: true
```
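Conceptually, the `chain` list above is an ordered pipeline: each request passes through every policy in turn, and a blocking policy short-circuits the rest. A rough Python illustration of that evaluation order (a sketch of the concept, not the gateway's implementation):

```python
# Each policy receives the prompt text and either transforms it, passes it
# through unchanged, or raises to block the request (conceptual sketch only).
class PolicyBlocked(Exception):
    pass

def prompt_injection(text: str) -> str:
    # Stand-in for the real classifier with threshold 0.8.
    if "ignore previous instructions" in text.lower():
        raise PolicyBlocked("prompt-injection")
    return text

def pii_detector(text: str) -> str:
    # Stand-in for "action: redact" on the EMAIL entity.
    return text.replace("alice@example.com", "[EMAIL]")

def run_chain(text: str, chain) -> str:
    for policy in chain:
        text = policy(text)  # blocking policies raise PolicyBlocked
    return text

CHAIN = [prompt_injection, pii_detector]
print(run_chain("Contact alice@example.com", CHAIN))  # → Contact [EMAIL]
```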
Multi-provider config with fallback
```yaml
pack:
  name: open-webui-multi
  version: 1.0.0
  enabled: true

providers:
  targets:
    - id: openai-primary
      provider: openai:chat:gpt-4o
      secret_key_ref:
        env: OPENAI_API_KEY
    - id: anthropic-fallback
      provider: anthropic:chat:claude-sonnet-4-20250514
      secret_key_ref:
        env: ANTHROPIC_API_KEY

policies:
  chain:
    - pii-detector
    - audit-logger
  policy:
    pii-detector:
      action: redact
      entities:
        - EMAIL
        - PHONE
    audit-logger:
      immutable: true
      retention_days: 365
      log_all_access: true
```
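With two targets defined, the gateway can fail over from the primary to the fallback provider. The behavior is roughly as follows (a sketch only; the stub functions stand in for real provider calls, and the gateway's actual retry semantics are handled server-side):

```python
# Conceptual sketch of primary/fallback target selection.
def call_openai_primary(prompt: str) -> str:
    raise ConnectionError("primary unavailable")  # simulate an outage

def call_anthropic_fallback(prompt: str) -> str:
    return f"fallback answered: {prompt}"

def complete_with_fallback(prompt: str, targets) -> str:
    last_error = None
    for target in targets:
        try:
            return target(prompt)
        except Exception as exc:  # a failed target falls through to the next
            last_error = exc
    raise RuntimeError("all targets failed") from last_error

TARGETS = [call_openai_primary, call_anthropic_fallback]
print(complete_with_fallback("hello", TARGETS))  # → fallback answered: hello
```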
Setup Steps
Docker Compose deployment
The recommended deployment runs Open WebUI and the Keeptrusts gateway together:
```yaml
# docker-compose.yml
services:
  keeptrusts-gateway:
    image: keeptrusts/gateway:latest
    ports:
      - "41002:41002"
    volumes:
      - ./policy-config.yaml:/etc/keeptrusts/policy-config.yaml:ro
    environment:
      - OPENAI_API_KEY=${OPENAI_API_KEY}
    command: >
      kt gateway run
      --listen 0.0.0.0:41002
      --policy-config /etc/keeptrusts/policy-config.yaml

  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    ports:
      - "3000:8080"
    environment:
      - OPENAI_API_BASE_URL=http://keeptrusts-gateway:41002/v1
      - OPENAI_API_KEY=unused
    depends_on:
      - keeptrusts-gateway
    volumes:
      - open-webui-data:/app/backend/data

volumes:
  open-webui-data:
```
Manual setup (without Docker Compose)
- Export your API key:

```shell
export OPENAI_API_KEY="sk-your-api-key"
```

- Start the gateway:

```shell
kt gateway run --listen 0.0.0.0:41002 --policy-config policy-config.yaml
```
- Configure Open WebUI to use the gateway. In Open WebUI's admin settings:
  - Navigate to Settings > Connections
  - Set OpenAI API Base URL to `http://localhost:41002/v1`
  - Set API Key to any non-empty value (the gateway handles auth)
- Start Open WebUI:

```shell
docker run -d \
  -p 3000:8080 \
  -e OPENAI_API_BASE_URL=http://host.docker.internal:41002/v1 \
  -e OPENAI_API_KEY=unused \
  -v open-webui:/app/backend/data \
  ghcr.io/open-webui/open-webui:main
```
For a hosted gateway, replace the base URL with `https://gateway.keeptrusts.com/v1`.
With local Ollama models
Route Ollama traffic through the gateway for governance on local model interactions:
```yaml
# Add to docker-compose.yml
services:
  ollama:
    image: ollama/ollama:latest
    ports:
      - "11434:11434"
    volumes:
      - ollama-data:/root/.ollama

  open-webui:
    environment:
      - OLLAMA_BASE_URL=http://ollama:11434
      - OPENAI_API_BASE_URL=http://keeptrusts-gateway:41002/v1
      - OPENAI_API_KEY=unused

volumes:
  ollama-data:
```
Verification
- Verify the gateway is running:

```shell
curl http://localhost:41002/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gpt-4o",
    "messages": [
      {"role": "user", "content": "Hello from Open WebUI gateway test."}
    ]
  }'
```
- Open the WebUI at `http://localhost:3000` and start a conversation.
- Check the Keeptrusts console Events page — every Open WebUI conversation should appear as an audit event.
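The curl check above can also be scripted with Python's standard library. This sketch only builds the request; sending it requires a running gateway on the default local port:

```python
import json
import urllib.request

def build_verification_request(base_url: str = "http://localhost:41002/v1"):
    """Build the same chat-completion request the curl example sends."""
    body = json.dumps({
        "model": "gpt-4o",
        "messages": [
            {"role": "user", "content": "Hello from Open WebUI gateway test."}
        ],
    }).encode("utf-8")
    return urllib.request.Request(
        f"{base_url}/chat/completions",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# Send with urllib.request.urlopen(build_verification_request())
# once the gateway is up; here we only inspect the built request.
req = build_verification_request()
print(req.full_url)  # → http://localhost:41002/v1/chat/completions
```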
Recommended Policies
| Policy | Purpose | Recommended setting |
|---|---|---|
| `prompt-injection` | Block adversarial prompts from WebUI users | `threshold: 0.8`, `action: block` |
| `pii-detector` | Redact personal data before it reaches the LLM provider | `action: redact`; entities: EMAIL, PHONE, SSN |
| `content-filter` | Block restricted topics in conversations | `action: block`; configure categories |
| `audit-logger` | Immutable audit trail for every conversation | `immutable: true`, `retention_days: 365` |
| `cost-attribution` | Track spend per WebUI user | Tag requests with user metadata |
| `disclaimer` | Attach compliance notices to AI responses | Configure per organizational requirement |
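To make the `pii-detector` row concrete: `action: redact` replaces matched entities before the prompt leaves your infrastructure. A simplified regex-based illustration (the gateway's real detector is configured via the policy pack, not this code, and is more robust than these patterns):

```python
import re

# Simplified patterns for two of the entity types listed above.
PATTERNS = {
    "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def redact(text: str) -> str:
    """Replace each detected entity with a placeholder like [EMAIL]."""
    for entity, pattern in PATTERNS.items():
        text = pattern.sub(f"[{entity}]", text)
    return text

print(redact("Mail bob@corp.example, SSN 123-45-6789"))
# → Mail [EMAIL], SSN [SSN]
```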
Troubleshooting
| Symptom | Cause | Fix |
|---|---|---|
| Open WebUI shows "Connection Error" | Gateway not reachable from the WebUI container | Use Docker service name (keeptrusts-gateway:41002) not localhost |
| No models listed in WebUI | Gateway not returning model list | Verify the gateway is running and the provider key is valid |
| Gateway returns 403 | Policy chain blocked the request | Review the policy violation in the console Events page |
| Slow responses | Policy chain processing or network latency | Check policy chain length; use the Docker Compose network for low latency |
| Docker networking issues | localhost resolves differently inside containers | Use host.docker.internal or Docker service names |
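As a rough companion to the 403 row above, a client can distinguish a policy block from other failures by status code. This is a heuristic sketch, not a documented gateway error contract:

```python
def classify_gateway_failure(status_code: int) -> str:
    """Map an HTTP status from the gateway to a likely cause (heuristic)."""
    if status_code == 403:
        # Policy chain blocked the request; check the console Events page.
        return "policy-block"
    if status_code in (401, 407):
        return "auth-error"
    if status_code >= 500:
        return "gateway-or-provider-error"
    return "ok" if status_code < 400 else "client-error"

print(classify_gateway_failure(403))  # → policy-block
```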
For AI systems
- Canonical terms: Keeptrusts gateway, Open WebUI, Ollama WebUI, self-hosted LLM UI, Docker Compose, OpenAI-compatible endpoint, policy-config.yaml.
- Config field names: `OPENAI_API_BASE_URL`, `OPENAI_API_KEY`, gateway `--listen` address.
- Key behavior: Open WebUI connects to the Keeptrusts gateway via its OpenAI-compatible endpoint configuration. All conversations are governed by the gateway's policy chain.
- Best next pages: OpenAI integration, Policy controls catalog, Install the gateway.
For engineers
- Start command: `docker compose up -d` with the provided `docker-compose.yml`.
- Set `OPENAI_API_BASE_URL=http://keeptrusts-gateway:41002/v1` in the Open WebUI container.
- Use Docker service names (not `localhost`) for inter-container communication.
- Open WebUI supports both OpenAI-compatible and Ollama backends — both can route through the gateway.
For leaders
- Open WebUI provides a self-hosted alternative to commercial AI chat products — the gateway adds enterprise-grade governance.
- Every conversation is logged with full attribution, meeting compliance requirements for regulated industries.
- Self-hosting plus gateway governance gives you complete control over where data flows — no data leaves your infrastructure unless explicitly configured.
- Multi-provider support means users can switch between models while governance policies apply consistently.
Next steps
- Install the gateway — detailed gateway installation guide
- Policy controls catalog — all available policy types
- OpenAI integration — full OpenAI provider reference
- Quickstart — first governed request tutorial