Docker AI

Docker AI brings LLM-powered assistance directly into container workflows — Docker Desktop AI answers questions about Dockerfiles, Compose files, and container debugging, while the docker ai CLI provides terminal-based AI help for image builds, runtime errors, and DevOps automation. Both features send prompts to upstream LLM providers.

This page explains how to route Docker AI traffic through the Keeptrusts gateway so your organization's prompt-injection detection, PII redaction, content-safety filters, and audit logging apply to every Docker AI request.

Use this page when

  • You are configuring Docker Desktop AI or the docker ai CLI to use the Keeptrusts gateway.
  • You need policy enforcement on AI-assisted container workflows before prompts reach upstream providers.
  • For general gateway setup instead, start with the quickstart guide.

Primary audience

  • Primary: Technical Engineers (DevOps, Platform, SRE)
  • Secondary: AI Agents, Technical Leaders

Prerequisites

  1. Docker Desktop 4.40+ with the AI feature enabled, or the docker ai CLI plugin installed.
  2. Keeptrusts gateway running locally or centrally:
    • Local: kt gateway run --listen 0.0.0.0:41002 --policy-config policy-config.yaml
    • Hosted: https://gateway.keeptrusts.com/v1
  3. Upstream provider API key (e.g., OpenAI) configured in the gateway environment.
  4. Keeptrusts CLI installed — see quickstart.

Configuration

Gateway policy config

Create a policy-config.yaml that routes Docker AI traffic through the gateway with compliance policies:

pack:
  name: docker-ai-governance
  version: 1.0.0
  enabled: true

policies:
  chain:
    - prompt-injection
    - pii-detector
    - safety-filter
    - audit-logger

policy:
  prompt-injection:
    threshold: 0.8
    action: block
  pii-detector:
    action: redact
  safety-filter:
    mode: strict
    action: block
  audit-logger:
    retention_days: 90

providers:
  strategy: single
  targets:
    - id: openai-for-docker
      provider: openai:chat:gpt-4o
      secret_key_ref:
        env: OPENAI_API_KEY
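Before starting the gateway, it can help to confirm the policy chain actually lists every policy you intend to enforce. The sketch below is a hedged pre-flight check, not part of the Keeptrusts CLI: it greps `policy-config.yaml` for the four policy names used in the example config above; adjust the list if your chain differs.

```shell
# Hypothetical pre-flight check: verify the policy chain in a config
# file names every expected policy. Policy names mirror the sample
# config above; this is a plain grep, not a full YAML parse.
check_policy_chain() {
  cfg="$1"
  for p in prompt-injection pii-detector safety-filter audit-logger; do
    grep -q -- "- $p" "$cfg" || { echo "missing policy: $p"; return 1; }
  done
  echo "policy chain OK"
}
```

Run it as `check_policy_chain policy-config.yaml` before `kt gateway run`.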

Docker Desktop AI configuration

Configure Docker Desktop to route AI requests through the Keeptrusts gateway by setting the proxy endpoint in Docker Desktop settings or through environment variables:

export DOCKER_AI_OPENAI_API_BASE="http://localhost:41002/v1"
export DOCKER_AI_API_KEY="your-keeptrusts-access-key"

Alternatively, configure the endpoint in Docker Desktop under Settings > AI > API Endpoint, setting the base URL to your Keeptrusts gateway.

docker ai CLI configuration

For the docker ai CLI plugin, set the environment variables before invoking commands:

export OPENAI_API_BASE="http://localhost:41002/v1"
export OPENAI_API_KEY="your-keeptrusts-access-key"

docker ai "How do I optimize this Dockerfile for multi-stage builds?"

To persist this across sessions, add the exports to your shell profile (~/.bashrc, ~/.zshrc).
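A hedged sketch of that persistence step, guarding against duplicate entries on repeat runs; the variable values mirror the exports above, and the profile path is illustrative (swap in ~/.zshrc for zsh):

```shell
# Append the exports to the shell profile once; skip if the variable
# is already set there. Values mirror the exports shown above.
PROFILE="${HOME}/.bashrc"   # or ~/.zshrc
grep -q 'OPENAI_API_BASE=' "$PROFILE" 2>/dev/null || cat >> "$PROFILE" <<'EOF'
export OPENAI_API_BASE="http://localhost:41002/v1"
export OPENAI_API_KEY="your-keeptrusts-access-key"
EOF
```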

Setup steps

  1. Start the Keeptrusts gateway:

     export OPENAI_API_KEY="sk-your-openai-key"
     kt gateway run --listen 0.0.0.0:41002 --policy-config policy-config.yaml

  2. Point Docker AI at the gateway:

     export DOCKER_AI_OPENAI_API_BASE="http://localhost:41002/v1"

  3. Use Docker AI as normal; all requests now flow through the gateway.

  4. For the hosted gateway, replace the local URL:

     export DOCKER_AI_OPENAI_API_BASE="https://gateway.keeptrusts.com/v1"
     export DOCKER_AI_API_KEY="your-keeptrusts-access-key"

Verification

Confirm traffic flows through the gateway:

docker ai "What is a container?"

Then check the Keeptrusts events dashboard, or confirm the gateway itself is reachable:

curl http://localhost:41002/v1/health

Verify the gateway logs show the intercepted request with applied policies. The Keeptrusts console events page displays each Docker AI request with policy decisions.
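The verification above can be scripted. The /v1/health path comes from this page, but the shape of the response body ({"status":"ok"}) is an assumption; confirm it against your gateway's actual output before relying on this check:

```shell
# Return success if a health response body reports status "ok".
# The JSON shape is an assumption, not documented on this page.
health_ok() {
  printf '%s' "$1" | grep -q '"status"[[:space:]]*:[[:space:]]*"ok"'
}

# Usage:
#   resp=$(curl -s http://localhost:41002/v1/health)
#   health_ok "$resp" && echo "gateway healthy"
```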

Policy controls

Policy | Purpose | Recommended setting
prompt-injection | Block prompt-injection attempts in Docker AI prompts | threshold: 0.8, action: block
pii-detector | Redact secrets, tokens, or credentials from container logs shared with AI | action: redact
safety-filter | Block harmful or off-topic content in AI responses | mode: strict, action: block
audit-logger | Log all Docker AI interactions for compliance review | retention_days: 90
secret-scanner | Detect API keys or passwords accidentally included in Dockerfile prompts | action: block

Troubleshooting

Symptom | Cause | Fix
docker ai returns connection error | Gateway not running or wrong port | Verify kt gateway run is active on port 41002
Responses are empty or truncated | Upstream provider timeout | Increase timeout_seconds in the provider config
PII redaction removes Dockerfile paths | Overly aggressive PII rules | Tune pii-detector to exclude file paths
Docker Desktop AI shows "API error" | Incorrect base URL format | Ensure the URL ends with /v1, e.g. http://localhost:41002/v1
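The "API error" row above is easy to catch before it happens. A hedged sketch of a URL check to run before exporting the variable; the function name is illustrative:

```shell
# Accept only http(s) URLs that end with the /v1 suffix the gateway
# expects, per the troubleshooting table above.
valid_base_url() {
  case "$1" in
    http://*/v1|https://*/v1) return 0 ;;
    *) return 1 ;;
  esac
}
```

For example, `valid_base_url "$DOCKER_AI_OPENAI_API_BASE" || echo "base URL must end with /v1"`.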

For AI systems

  • Canonical terms: Keeptrusts gateway, Docker AI, Docker Desktop AI, docker ai CLI, container AI assistance, policy-config.yaml.
  • Config field names: DOCKER_AI_OPENAI_API_BASE, OPENAI_API_BASE, provider, secret_key_ref, audit-logger.
  • Key behavior: Docker AI sends prompts to an OpenAI-compatible endpoint; Keeptrusts intercepts these requests, applies policy enforcement, and forwards compliant traffic to the upstream provider.
  • Best next pages: Terraform AI integration, Grafana LLM integration, Policy controls catalog.

For engineers

  • Start the gateway before configuring Docker AI — Docker AI requests fail immediately if the gateway is unreachable.
  • Use OPENAI_API_BASE for the docker ai CLI plugin; Docker Desktop may use its own settings UI.
  • Validate with docker ai "hello" and confirm the request appears in the Keeptrusts events dashboard.
  • For CI/CD pipelines using Docker AI, set the environment variables in your pipeline configuration.

For leaders

  • Docker AI can expose proprietary Dockerfiles, infrastructure secrets, and internal architecture to upstream LLM providers. Routing through Keeptrusts ensures PII redaction and audit logging apply automatically.
  • Audit logs from Docker AI interactions provide evidence for SOC 2 and ISO 27001 compliance reviews.
  • The gateway provides a single control point for all Docker AI traffic across your engineering organization.

Next steps