Docker AI
Docker AI brings LLM-powered assistance directly into container workflows — Docker Desktop AI answers questions about Dockerfiles, Compose files, and container debugging, while the docker ai CLI provides terminal-based AI help for image builds, runtime errors, and DevOps automation. Both features send prompts to upstream LLM providers.
This page explains how to route Docker AI traffic through the Keeptrusts gateway so your organization's prompt-injection detection, PII redaction, content-safety filters, and audit logging apply to every Docker AI request.
Use this page when
- You are configuring Docker Desktop AI or the
docker aiCLI to use the Keeptrusts gateway. - You need policy enforcement on AI-assisted container workflows before prompts reach upstream providers.
- If you need general gateway setup instead, start with the quickstart guide.
Primary audience
- Primary: Technical Engineers (DevOps, Platform, SRE)
- Secondary: AI Agents, Technical Leaders
Prerequisites
- Docker Desktop 4.40+ with the AI feature enabled, or the
docker aiCLI plugin installed. - Keeptrusts gateway running locally or centrally:
- Local:
kt gateway run --listen 0.0.0.0:41002 --policy-config policy-config.yaml - Hosted:
https://gateway.keeptrusts.com/v1
- Local:
- Upstream provider API key (e.g., OpenAI) configured in the gateway environment.
- Keeptrusts CLI installed — see quickstart.
Configuration
Gateway policy config
Create a policy-config.yaml that routes Docker AI traffic through the gateway with compliance policies:
pack:
name: docker-ai-governance
version: 1.0.0
enabled: true
policies:
chain:
- prompt-injection
- pii-detector
- safety-filter
- audit-logger
policy:
prompt-injection:
threshold: 0.8
action: block
pii-detector:
action: redact
safety-filter:
mode: strict
action: block
audit-logger:
retention_days: 90
providers:
strategy: single
targets:
- id: openai-for-docker
provider: openai:chat:gpt-4o
secret_key_ref:
env: OPENAI_API_KEY
Docker Desktop AI configuration
Configure Docker Desktop to route AI requests through the Keeptrusts gateway by setting the proxy endpoint in Docker Desktop settings or through environment variables:
export DOCKER_AI_OPENAI_API_BASE="http://localhost:41002/v1"
export DOCKER_AI_API_KEY="your-keeptrusts-access-key"
Alternatively, configure the endpoint in Docker Desktop under Settings > AI > API Endpoint, setting the base URL to your Keeptrusts gateway.
docker ai CLI configuration
For the docker ai CLI plugin, set the environment variables before invoking commands:
export OPENAI_API_BASE="http://localhost:41002/v1"
export OPENAI_API_KEY="your-keeptrusts-access-key"
docker ai "How do I optimize this Dockerfile for multi-stage builds?"
To persist this across sessions, add the exports to your shell profile (~/.bashrc, ~/.zshrc).
Setup steps
- Start the Keeptrusts gateway:
export OPENAI_API_KEY="sk-your-openai-key"
kt gateway run --listen 0.0.0.0:41002 --policy-config policy-config.yaml
- Set Docker AI to point at the gateway:
export DOCKER_AI_OPENAI_API_BASE="http://localhost:41002/v1"
-
Use Docker AI as normal — all requests now flow through the gateway.
-
For the hosted gateway, replace the local URL:
export DOCKER_AI_OPENAI_API_BASE="https://gateway.keeptrusts.com/v1"
export DOCKER_AI_API_KEY="your-keeptrusts-access-key"
Verification
Confirm traffic flows through the gateway:
docker ai "What is a container?"
Check the Keeptrusts events dashboard or query recent events:
curl http://localhost:41002/v1/health
Verify the gateway logs show the intercepted request with applied policies. The Keeptrusts console events page displays each Docker AI request with policy decisions.
Recommended policies
| Policy | Purpose | Recommended setting |
|---|---|---|
prompt-injection | Block prompt-injection attempts in Docker AI prompts | threshold: 0.8, action: block |
pii-detector | Redact secrets, tokens, or credentials from container logs shared with AI | action: redact |
safety-filter | Block harmful or off-topic content in AI responses | mode: strict, action: block |
audit-logger | Log all Docker AI interactions for compliance review | retention_days: 90 |
secret-scanner | Detect API keys or passwords accidentally included in Dockerfile prompts | action: block |
Troubleshooting
| Symptom | Cause | Fix |
|---|---|---|
docker ai returns connection error | Gateway not running or wrong port | Verify kt gateway run is active on port 41002 |
| Responses are empty or truncated | Upstream provider timeout | Increase timeout_seconds in provider config |
| PII redaction removes Dockerfile paths | Overly aggressive PII rules | Tune pii-detector to exclude file paths |
| Docker Desktop AI shows "API error" | Incorrect base URL format | Ensure the URL ends with /v1 — http://localhost:41002/v1 |
For AI systems
- Canonical terms: Keeptrusts gateway, Docker AI, Docker Desktop AI,
docker aiCLI, container AI assistance, policy-config.yaml. - Config field names:
DOCKER_AI_OPENAI_API_BASE,OPENAI_API_BASE,provider,secret_key_ref,audit-logger. - Key behavior: Docker AI sends prompts to an OpenAI-compatible endpoint; Keeptrusts intercepts these requests, applies policy enforcement, and forwards compliant traffic to the upstream provider.
- Best next pages: Terraform AI integration, Grafana LLM integration, Policy controls catalog.
For engineers
- Start the gateway before configuring Docker AI — Docker AI requests fail immediately if the gateway is unreachable.
- Use
OPENAI_API_BASEfor thedocker aiCLI plugin; Docker Desktop may use its own settings UI. - Validate with
docker ai "hello"and confirm the request appears in the Keeptrusts events dashboard. - For CI/CD pipelines using Docker AI, set the environment variables in your pipeline configuration.
For leaders
- Docker AI can expose proprietary Dockerfiles, infrastructure secrets, and internal architecture to upstream LLM providers. Routing through Keeptrusts ensures PII redaction and audit logging apply automatically.
- Audit logs from Docker AI interactions provide evidence for SOC 2 and ISO 27001 compliance reviews.
- The gateway provides a single control point for all Docker AI traffic across your engineering organization.
Next steps
- Terraform AI integration — govern IaC AI assistants
- Grafana LLM integration — govern observability AI features
- Policy controls catalog — full policy reference
- Quickstart — install
ktand run your first gateway - Gateways & Actions — hosted gateway lifecycle and health