Open WebUI

Open WebUI (formerly Ollama WebUI) is a self-hosted, extensible web interface for interacting with LLMs. It supports OpenAI-compatible endpoints, making it straightforward to route all LLM traffic through the Keeptrusts gateway. This gives you policy enforcement, PII redaction, content filtering, audit logging, and cost attribution for every conversation in your Open WebUI deployment.

Use this page when

  • You are deploying Open WebUI and need governance over all LLM interactions.
  • You want to route Open WebUI's LLM traffic through the Keeptrusts gateway.
  • You need Ollama-specific local model configuration; consult the Ollama documentation alongside this guide.

Primary audience

  • Primary: Technical Engineers
  • Secondary: AI Agents, Technical Leaders

Prerequisites

  • Docker and Docker Compose installed
  • Keeptrusts CLI (kt) installed and on your PATH
  • An LLM provider API key (OpenAI, Anthropic, or compatible) or a local Ollama instance

Configuration

Gateway policy config

```yaml
# policy-config.yaml
pack:
  name: open-webui-gateway
  version: 1.0.0
  enabled: true
providers:
  targets:
    - id: openai-webui
      provider: openai:chat:gpt-4o
      secret_key_ref:
        env: OPENAI_API_KEY
policies:
  chain:
    - prompt-injection
    - pii-detector
    - content-filter
    - audit-logger
  policy:
    prompt-injection:
      threshold: 0.8
      action: block
    pii-detector:
      action: redact
      entities:
        - EMAIL
        - PHONE
        - SSN
        - CREDIT_CARD
    content-filter:
      action: block
      categories:
        - restricted-topics
    audit-logger:
      immutable: true
      retention_days: 365
      log_all_access: true
```
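For intuition, the pii-detector policy's redact action behaves roughly like the sketch below. This is a conceptual illustration only: the entity names come from the config above, but the regexes and the `redact_pii` helper are hypothetical simplifications, not the gateway's implementation.

```python
import re

# Hypothetical patterns approximating the entities listed in the config above.
# The gateway's real detectors are more sophisticated than these regexes.
PATTERNS = {
    "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "PHONE": re.compile(r"\b\d{3}[-.\s]\d{3}[-.\s]\d{4}\b"),
    "SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def redact_pii(text: str) -> str:
    """Replace each detected entity with a [REDACTED:<ENTITY>] placeholder."""
    for entity, pattern in PATTERNS.items():
        text = pattern.sub(f"[REDACTED:{entity}]", text)
    return text

redacted = redact_pii("Reach me at jane@example.com or 555-867-5309.")
```

With `action: redact`, the prompt that reaches the provider contains placeholders instead of the original personal data, while the conversation itself is unblocked.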

Multi-provider config with fallback

```yaml
pack:
  name: open-webui-multi
  version: 1.0.0
  enabled: true
providers:
  targets:
    - id: openai-primary
      provider: openai:chat:gpt-4o
      secret_key_ref:
        env: OPENAI_API_KEY
    - id: anthropic-fallback
      provider: anthropic:chat:claude-sonnet-4-20250514
      secret_key_ref:
        env: ANTHROPIC_API_KEY
policies:
  chain:
    - pii-detector
    - audit-logger
  policy:
    pii-detector:
      action: redact
      entities:
        - EMAIL
        - PHONE
    audit-logger:
      immutable: true
      retention_days: 365
      log_all_access: true
```
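Conceptually, the gateway tries openai-primary first and falls back to anthropic-fallback on failure. A rough sketch of that behavior follows; it is illustrative only, and the `call_with_fallback` helper and simulated providers are hypothetical, not the gateway's actual code.

```python
from typing import Callable

def call_with_fallback(
    targets: list[tuple[str, Callable[[str], str]]], prompt: str
) -> str:
    """Try each configured target in order; return the first successful response."""
    last_error = None
    for target_id, call_target in targets:
        try:
            return call_target(prompt)
        except Exception as exc:  # the real gateway distinguishes error types
            last_error = exc
    raise RuntimeError("all providers failed") from last_error

# Simulated providers: the primary is down, the fallback answers.
def openai_primary(prompt: str) -> str:
    raise ConnectionError("primary unavailable")

def anthropic_fallback(prompt: str) -> str:
    return f"fallback answered: {prompt}"

result = call_with_fallback(
    [("openai-primary", openai_primary), ("anthropic-fallback", anthropic_fallback)],
    "hello",
)
```

Because the fallback happens inside the gateway, Open WebUI keeps pointing at a single OpenAI-compatible endpoint and needs no client-side retry logic.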

Setup Steps

Docker Compose deployment

The recommended deployment runs Open WebUI and the Keeptrusts gateway together:

```yaml
# docker-compose.yml
services:
  keeptrusts-gateway:
    image: keeptrusts/gateway:latest
    ports:
      - "41002:41002"
    volumes:
      - ./policy-config.yaml:/etc/keeptrusts/policy-config.yaml:ro
    environment:
      - OPENAI_API_KEY=${OPENAI_API_KEY}
    command: >
      kt gateway run
      --listen 0.0.0.0:41002
      --policy-config /etc/keeptrusts/policy-config.yaml

  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    ports:
      - "3000:8080"
    environment:
      - OPENAI_API_BASE_URL=http://keeptrusts-gateway:41002/v1
      - OPENAI_API_KEY=unused
    depends_on:
      - keeptrusts-gateway
    volumes:
      - open-webui-data:/app/backend/data

volumes:
  open-webui-data:
```

Manual setup (without Docker Compose)

  1. Export your API key:

```shell
export OPENAI_API_KEY="sk-your-api-key"
```

  2. Start the gateway:

```shell
kt gateway run --listen 0.0.0.0:41002 --policy-config policy-config.yaml
```

  3. Configure Open WebUI to use the gateway. In Open WebUI's admin settings:

    • Navigate to Settings > Connections
    • Set OpenAI API Base URL to http://localhost:41002/v1
    • Set API Key to any non-empty value (the gateway handles auth)

  4. Start Open WebUI:

```shell
docker run -d \
  -p 3000:8080 \
  -e OPENAI_API_BASE_URL=http://host.docker.internal:41002/v1 \
  -e OPENAI_API_KEY=unused \
  -v open-webui:/app/backend/data \
  ghcr.io/open-webui/open-webui:main
```

For a hosted gateway, replace the base URL with https://gateway.keeptrusts.com/v1.

With local Ollama models

Route Ollama traffic through the gateway for governance on local model interactions:

```yaml
# Add to docker-compose.yml
services:
  ollama:
    image: ollama/ollama:latest
    ports:
      - "11434:11434"
    volumes:
      - ollama-data:/root/.ollama

  open-webui:
    environment:
      - OLLAMA_BASE_URL=http://ollama:11434
      - OPENAI_API_BASE_URL=http://keeptrusts-gateway:41002/v1
      - OPENAI_API_KEY=unused

volumes:
  ollama-data:
```

Verification

  1. Verify the gateway is running:

```shell
curl http://localhost:41002/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gpt-4o",
    "messages": [
      {"role": "user", "content": "Hello from Open WebUI gateway test."}
    ]
  }'
```

  2. Open the WebUI at http://localhost:3000 and start a conversation.

  3. Check the Keeptrusts console Events page; every Open WebUI conversation should appear as an audit event.
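The step-1 check can also be scripted. The following stdlib sketch builds and sends the same request; the URL and model come from this guide, and the `send` helper is just an illustration, to be called only while the gateway is up.

```python
import json
import urllib.request

GATEWAY_URL = "http://localhost:41002/v1/chat/completions"

# Same request body as the curl check above.
payload = {
    "model": "gpt-4o",
    "messages": [
        {"role": "user", "content": "Hello from Open WebUI gateway test."}
    ],
}

def send(url: str = GATEWAY_URL) -> dict:
    """POST the chat completion request to the gateway and decode the reply."""
    req = urllib.request.Request(
        url,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

Call `send()` once the gateway is listening; a request blocked by the policy chain surfaces as an HTTP 403, per the troubleshooting table below.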

| Policy | Purpose | Recommended setting |
| --- | --- | --- |
| prompt-injection | Block adversarial prompts from WebUI users | threshold: 0.8, action: block |
| pii-detector | Redact personal data before it reaches the LLM provider | action: redact; entities: EMAIL, PHONE, SSN |
| content-filter | Block restricted topics in conversations | action: block; configure categories |
| audit-logger | Immutable audit trail for every conversation | immutable: true, retention_days: 365 |
| cost-attribution | Track spend per WebUI user | Tag requests with user metadata |
| disclaimer | Attach compliance notices to AI responses | Configure per organizational requirement |
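The cost-attribution and disclaimer policies are not wired into the configs above. If you enable them, they follow the same chain/policy pattern; note that the field names below (tag_header, text) are illustrative assumptions, so check the policy controls catalog for the actual schema.

```yaml
policies:
  chain:
    - pii-detector
    - cost-attribution   # policy id from the table above
    - disclaimer         # policy id from the table above
    - audit-logger
  policy:
    cost-attribution:
      # Illustrative field: how per-user metadata might be attached to requests.
      tag_header: X-WebUI-User
    disclaimer:
      # Illustrative field: notice text appended to responses.
      text: "AI-generated content. Verify before acting."
```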

Troubleshooting

| Symptom | Cause | Fix |
| --- | --- | --- |
| Open WebUI shows "Connection Error" | Gateway not reachable from the WebUI container | Use the Docker service name (keeptrusts-gateway:41002), not localhost |
| No models listed in WebUI | Gateway not returning model list | Verify the gateway is running and the provider key is valid |
| Gateway returns 403 | Policy chain blocked the request | Review the policy violation in the console Events page |
| Slow responses | Policy chain processing or network latency | Check policy chain length; use the Docker Compose network for low latency |
| Docker networking issues | localhost resolves differently inside containers | Use host.docker.internal or Docker service names |

For AI systems

  • Canonical terms: Keeptrusts gateway, Open WebUI, Ollama WebUI, self-hosted LLM UI, Docker Compose, OpenAI-compatible endpoint, policy-config.yaml.
  • Config field names: OPENAI_API_BASE_URL, OPENAI_API_KEY, gateway --listen address.
  • Key behavior: Open WebUI connects to the Keeptrusts gateway via its OpenAI-compatible endpoint configuration. All conversations are governed by the gateway's policy chain.
  • Best next pages: OpenAI integration, Policy controls catalog, Install the gateway.

For engineers

  • Start command: docker compose up -d with the provided docker-compose.yml.
  • Set OPENAI_API_BASE_URL=http://keeptrusts-gateway:41002/v1 in the Open WebUI container.
  • Use Docker service names (not localhost) for inter-container communication.
  • Open WebUI supports both OpenAI-compatible and Ollama backends — both can route through the gateway.

For leaders

  • Open WebUI provides a self-hosted alternative to commercial AI chat products — the gateway adds enterprise-grade governance.
  • Every conversation is logged with full attribution, meeting compliance requirements for regulated industries.
  • Self-hosting plus gateway governance gives you complete control over where data flows — no data leaves your infrastructure unless explicitly configured.
  • Multi-provider support means users can switch between models while governance policies apply consistently.

Next steps