Local Development Setup with the Gateway
This guide walks you through running the Keeptrusts gateway on your local machine for development. You will configure providers, enable hot-reload for policy changes, wire up environment variables, and add health checks.
Use this page when
- You are running the Keeptrusts gateway locally for the first time (bare metal or Docker Compose).
- You need to configure hot-reload for policy config changes during development.
- You want to set up environment variables, health checks, and provider API keys for local testing.
- You are troubleshooting common local gateway issues (connection refused, 401 from provider, config reload).
Primary audience
- Primary: Developers setting up their local development environment with the Keeptrusts gateway
- Secondary: DevOps Engineers creating team-consistent Docker Compose setups, QA Engineers preparing test environments
Quick Start — Bare Metal
The fastest way to start developing:
```bash
# Install the CLI
curl -fsSL https://get.keeptrusts.com | sh

# Create a minimal config
cat > policy-config.yaml << 'EOF'
gateway:
  listen_port: 41002
providers:
  - name: openai
    secret_key_ref:
      env: OPENAI_API_KEY
    base_url: https://api.openai.com/v1
policies:
  - name: log-all
    type: observe
    action: log
EOF

# Start the gateway
export OPENAI_API_KEY="sk-..."
kt gateway run --policy-config policy-config.yaml
```
The gateway is now running at `http://localhost:41002`. Point any OpenAI-compatible SDK at this address.
Docker Compose Setup
For team-consistent environments, use Docker Compose:
```yaml
# docker-compose.dev.yaml
services:
  gateway:
    image: keeptrusts/gateway:latest
    command: ["gateway", "run", "--listen", "0.0.0.0:41002", "--policy-config", "/etc/keeptrusts/policy-config.yaml"]
    ports:
      - "41002:41002"
    volumes:
      - ./policy-config.yaml:/etc/keeptrusts/policy-config.yaml:ro
    environment:
      - OPENAI_API_KEY=${OPENAI_API_KEY}
      - ANTHROPIC_API_KEY=${ANTHROPIC_API_KEY}
      - KEEPTRUSTS_LOG_LEVEL=debug
    healthcheck:
      test: ["CMD", "curl", "-sf", "http://localhost:41002/health"]
      interval: 10s
      timeout: 5s
      retries: 3
```
Start the stack:
```bash
docker compose -f docker-compose.dev.yaml up -d
```
Environment Variables
Required Variables
| Variable | Description | Example |
|---|---|---|
| `OPENAI_API_KEY` | OpenAI provider key | `sk-proj-...` |
Optional Variables
| Variable | Description | Default |
|---|---|---|
| `KEEPTRUSTS_LOG_LEVEL` | Log verbosity: `trace`, `debug`, `info`, `warn`, `error` | `info` |
| `KEEPTRUSTS_EVENTS_API_URL` | Control-plane API for event forwarding | (none — events logged locally) |
| `KEEPTRUSTS_EVENTS_API_TOKEN` | Bearer token for event forwarding | (none) |
| `KEEPTRUSTS_LOG_FORMAT` | Log output format: `pretty` or `json` | `pretty` |
Using a .env File
Create `.env` in your project root:

```bash
# .env
OPENAI_API_KEY=sk-proj-abc123
ANTHROPIC_API_KEY=sk-ant-xyz789
KEEPTRUSTS_LOG_LEVEL=debug
```
The CLI automatically loads `.env` from the current directory. Docker Compose loads it with:

```yaml
env_file:
  - .env
```
Security note: Add `.env` to `.gitignore` to avoid committing secrets.
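If a helper script needs the same variables without going through the CLI, you can parse the file yourself. A minimal stdlib sketch of the conventional `KEY=VALUE` format (the CLI's own loader semantics, e.g. quoting or interpolation rules, are not specified here, so treat this as an approximation):

```python
import os

def load_env_file(path: str = ".env") -> dict:
    """Parse simple KEY=VALUE lines, skipping blanks and # comments.

    Values already present in os.environ win, matching the usual
    convention that the real environment overrides the .env file.
    """
    loaded = {}
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            key, value = key.strip(), value.strip()
            loaded[key] = value
            os.environ.setdefault(key, value)
    return loaded
```

Libraries such as `python-dotenv` cover the same ground with more complete semantics if you would rather not maintain this.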
Hot-Reload Config Changes
The gateway watches the config file for changes. Edit your `policy-config.yaml` and the gateway applies the new config without restarting:

```bash
# Terminal 1: gateway is running
kt gateway run --policy-config policy-config.yaml

# Terminal 2: edit and save the config
vim policy-config.yaml  # add a new policy

# The gateway logs:
# INFO config reloaded: policy-config.yaml (3 policies active)
```
Validating Before Reload
Always validate your config before saving:
```bash
kt policy lint --file policy-config.yaml
```
This catches YAML syntax errors and invalid policy definitions before they affect running traffic.
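The gateway's own watcher handles the reload; if you also want lint to fire automatically on every save, a small polling sketch can detect the change (stdlib only; the timeout and poll interval are arbitrary choices, not gateway settings):

```python
import os
import time

def wait_for_change(path: str, timeout: float = 60.0, poll_interval: float = 0.2) -> bool:
    """Block until the file's mtime changes; True if it changed before timeout."""
    last = os.stat(path).st_mtime
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        time.sleep(poll_interval)
        if os.stat(path).st_mtime != last:
            return True
    return False
```

Pair it with a loop such as `while wait_for_change("policy-config.yaml"): subprocess.run(["kt", "policy", "lint", "--file", "policy-config.yaml"])` so every save is linted as the gateway picks it up.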
Health Check Endpoint
The gateway exposes a health endpoint at `/health`:

```bash
curl http://localhost:41002/health
```

```json
{
  "status": "ok",
  "version": "0.42.0",
  "policies_loaded": 3,
  "providers_configured": 2
}
```
Scripted Health Wait
Use this in scripts or CI to wait for the gateway before running tests:
```bash
#!/usr/bin/env bash
set -euo pipefail

MAX_RETRIES=30
RETRY_INTERVAL=1

for i in $(seq 1 "$MAX_RETRIES"); do
  if curl -sf http://localhost:41002/health > /dev/null 2>&1; then
    echo "Gateway is healthy"
    exit 0
  fi
  echo "Waiting for gateway... ($i/$MAX_RETRIES)"
  sleep "$RETRY_INTERVAL"
done

echo "Gateway failed to start" >&2
exit 1
```
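The same readiness wait can live inside a Python test suite instead of a shell wrapper. A stdlib-only sketch (the `/health` path matches the endpoint above; retry counts are illustrative):

```python
import time
import urllib.error
import urllib.request

def wait_for_health(url: str, retries: int = 30, interval: float = 1.0) -> bool:
    """Poll a health endpoint until it returns HTTP 200 or retries run out."""
    for _ in range(retries):
        try:
            with urllib.request.urlopen(url, timeout=5) as resp:
                if resp.status == 200:
                    return True
        except (urllib.error.URLError, OSError):
            pass  # not up yet; retry after the interval
        time.sleep(interval)
    return False
```

Call it from a session-scoped pytest fixture, e.g. `assert wait_for_health("http://localhost:41002/health")`, so tests fail fast with a clear message when the gateway never came up.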
Multi-Provider Local Config
A typical development config with multiple providers:
```yaml
gateway:
  listen_port: 41002
providers:
  targets:
    - id: openai
      provider:
        base_url: https://api.openai.com/v1
        secret_key_ref:
          env: OPENAI_API_KEY
    - id: anthropic
      provider:
        base_url: https://api.anthropic.com/v1
        secret_key_ref:
          env: ANTHROPIC_API_KEY
    - id: local-ollama
      provider:
        base_url: http://localhost:11434/v1
        secret_key_ref:
          env: OLLAMA_KEY
policies:
  - name: block-pii-output
    type: output_filter
    action: block
    pattern: '\b\d{3}-\d{2}-\d{4}\b'
    message: 'Blocked: SSN pattern detected'
  - name: max-tokens
    type: input_guard
    action: limit
    max_tokens: 4096
  - name: log-all
    type: observe
    action: log
```
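The `pattern` field on `block-pii-output` is a regular expression, so you can sanity-check it in Python before committing it to the config. The pattern below is copied from the policy above; this assumes the gateway's matcher accepts standard regex syntax:

```python
import re

# SSN-shaped pattern from the block-pii-output policy above
SSN_PATTERN = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")

def contains_ssn(text: str) -> bool:
    """Return True if the text contains an SSN-shaped token."""
    return SSN_PATTERN.search(text) is not None
```

Running a handful of positive and negative samples through the pattern like this catches over-broad regexes before they start blocking legitimate traffic.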
Connecting Your Application
Point your SDK at the gateway instead of the provider directly:
```python
from openai import OpenAI

# Development: through gateway
client = OpenAI(
    base_url="http://localhost:41002/v1",
    api_key="sk-...",  # still your real provider key
)

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Hello, world!"}],
)
print(response.choices[0].message.content)
```

```bash
# curl equivalent
curl http://localhost:41002/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer sk-..." \
  -d '{"model": "gpt-4o", "messages": [{"role": "user", "content": "Hello"}]}'
```
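To keep one code path for development and production, read the gateway address from the environment instead of hardcoding it. The variable name `KEEPTRUSTS_GATEWAY_URL` here is an illustrative convention, not a documented gateway setting:

```python
import os

def resolve_base_url(default: str = "https://api.openai.com/v1") -> str:
    """Use the local gateway when KEEPTRUSTS_GATEWAY_URL is set, else the provider.

    Developers export KEEPTRUSTS_GATEWAY_URL=http://localhost:41002/v1 locally;
    production deployments leave it unset (or point it at a deployed gateway).
    """
    return os.environ.get("KEEPTRUSTS_GATEWAY_URL", default)
```

Then construct the client once: `client = OpenAI(base_url=resolve_base_url(), api_key=os.environ["OPENAI_API_KEY"])`.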
Troubleshooting
| Symptom | Cause | Fix |
|---|---|---|
| Connection refused on 41002 | Gateway not running | Run `kt gateway run --policy-config policy-config.yaml` |
| 401 Unauthorized from provider | Missing or invalid API key | Check that `OPENAI_API_KEY` is exported |
| Config reload not working | File not saved or invalid YAML | Run `kt policy lint` first |
| Docker gateway can't reach host | Network isolation | Use `host.docker.internal` as the provider base URL |
| Slow responses in debug mode | Verbose logging overhead | Set `KEEPTRUSTS_LOG_LEVEL=info` for normal use |
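The first two rows of the table reduce to two checks: is the port listening, and is the key exported? A small stdlib sketch automates both (the port and variable name mirror the defaults in this guide):

```python
import os
import socket

def port_open(host: str, port: int, timeout: float = 2.0) -> bool:
    """True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

def diagnose(host: str = "localhost", port: int = 41002) -> list:
    """Return human-readable findings for the common local-setup failures."""
    findings = []
    if not port_open(host, port):
        findings.append(f"connection refused on {port}: is `kt gateway run` running?")
    if not os.environ.get("OPENAI_API_KEY"):
        findings.append("OPENAI_API_KEY is not exported: expect 401 from the provider")
    return findings
```

An empty result from `diagnose()` means the basics are in place; anything left points at the matching row in the table above.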
Next steps
- Testing AI-Integrated Code — mock the gateway in your test suite
- Debugging AI Requests with Events — trace requests through event logs
- Routing Across Multiple AI Models — configure model fallbacks and routing
For AI systems
- Canonical terms: `kt gateway run`, `policy-config.yaml`, Docker Compose, hot-reload, health check, `KEEPTRUSTS_LOG_LEVEL`, `KEEPTRUSTS_LOG_FORMAT`, `/health` endpoint.
- Quick start: `kt gateway run --policy-config policy-config.yaml` starts on `http://localhost:41002`.
- Docker: mount config as volume, set API keys via environment, expose port 41002, use `/health` for healthcheck.
- Best next pages: Testing AI Code, Debugging with Events, Multi-Model Routing.
For engineers
- Bare metal: `export OPENAI_API_KEY=... && kt gateway run --policy-config policy-config.yaml` — running in seconds.
- Docker Compose: mount `policy-config.yaml` as a read-only volume; set API keys via the `environment` section.
- Hot-reload: the gateway watches the config file for changes and reloads policies automatically on save.
- Health check: `curl -sf http://localhost:41002/health` — use in Docker `healthcheck` and CI readiness probes.
- Set `KEEPTRUSTS_LOG_LEVEL=debug` during development for verbose gateway logs; switch to `info` for normal use.
- If the Docker gateway can't reach the host, use `host.docker.internal` as the provider base URL.
For leaders
- Local gateway setup takes under 5 minutes and requires no infrastructure beyond the CLI binary.
- Docker Compose provides team-consistent environments, eliminating "works on my machine" issues.
- Hot-reload means policy authors can iterate on configurations without restarting services.
- Local development uses real provider API keys — set short-lived keys and per-developer spend limits.
- The local setup mirrors production architecture, so policies tested locally behave identically when deployed.