Dify with Keeptrusts Gateway
Dify is an open-source low-code platform for building AI applications — chatbots, agents, RAG pipelines, and workflow automations — with a visual editor and model management layer. By configuring Dify to route LLM calls through the Keeptrusts gateway, every model interaction in your Dify applications passes through your policy chain for prompt-injection detection, PII redaction, audit logging, cost attribution, and content filtering, all without modifying your Dify workflows.
Use this page when
- You are running a Dify instance and need governance over all LLM calls.
- You want audit logging and cost attribution for Dify chatbots and workflow applications.
- You need to enforce compliance controls across multiple Dify workspaces.
- You are deploying Dify in a regulated environment and need centralized policy enforcement.
Primary audience
- Primary: Technical Engineers
- Secondary: AI Agents, Technical Leaders
Prerequisites
- Keeptrusts CLI installed and a gateway running locally or centrally (Quickstart).
- Dify instance running (self-hosted or Dify Cloud) with admin access.
- Upstream provider API key (e.g. OpenAI, Anthropic) ready to configure.
- A `policy-config.yaml` deployed to the gateway.
Configuration
Gateway policy config
A minimal config for governing Dify traffic:
```yaml
pack:
  name: dify-gateway
  version: "1.0"

providers:
  - name: openai
    model: gpt-4o
    secret_key_ref:
      env: OPENAI_API_KEY

policies:
  chain:
    - prompt-injection
    - pii-detector
    - safety-filter
    - quality-scorer
  policy:
    prompt-injection:
      action: block
    pii-detector:
      action: redact
    safety-filter:
      action: block
    quality-scorer:
      threshold: 0.6
```
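The `chain` entries run in order for each request, and a `block` action stops evaluation. A rough sketch of that sequencing in Python (purely illustrative semantics, not the gateway's actual implementation):

```python
# Illustrative sketch of policy-chain semantics only; not Keeptrusts internals.
# Each policy returns an action and a possibly rewritten prompt; a "block"
# action short-circuits the chain, mirroring `action: block` in the config.

def prompt_injection(prompt):
    # Hypothetical detector: flag an obvious jailbreak phrase.
    if "ignore previous instructions" in prompt.lower():
        return "block", prompt
    return "allow", prompt

def pii_detector(prompt):
    # Hypothetical redaction: mask a known email address.
    return "redact", prompt.replace("alice@example.com", "[REDACTED]")

def run_chain(prompt, chain):
    """Apply each policy in order; stop at the first block."""
    decisions = []
    for name, policy in chain:
        action, prompt = policy(prompt)
        decisions.append((name, action))
        if action == "block":
            return None, decisions
    return prompt, decisions

chain = [("prompt-injection", prompt_injection), ("pii-detector", pii_detector)]
out, decisions = run_chain("Summarize the mail from alice@example.com", chain)
print(out)        # Summarize the mail from [REDACTED]
print(decisions)  # [('prompt-injection', 'allow'), ('pii-detector', 'redact')]
```

The upstream provider only ever sees the prompt that survives the chain, which is why Dify workflows need no awareness of the policies.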
Start the gateway:
```shell
kt gateway run --policy-config policy-config.yaml
```
Dify model provider configuration
Dify supports custom model provider endpoints through its model management interface. Configure it to point at the Keeptrusts gateway instead of the upstream provider directly.
You can point Dify at the gateway in three ways:

- OpenAI-compatible provider (admin UI)
- Self-hosted (environment variables)
- Hosted gateway
1. Open the Dify admin panel and navigate to Settings → Model Providers.
2. Select OpenAI-API-compatible as the provider type.
3. Configure the following fields:

   | Field | Value |
   |---|---|
   | Model name | `gpt-4o` (or your target model) |
   | API Key | Your upstream provider API key |
   | API Endpoint URL | `http://localhost:41002/v1` |

4. Click Save to register the provider.
For self-hosted Dify, set environment variables in your `docker-compose.yml` or `.env` file:

```shell
# In your Dify docker-compose.yml or .env
OPENAI_API_BASE=http://host.docker.internal:41002/v1
OPENAI_API_KEY=your-openai-api-key
```
If Dify and the gateway run on the same Docker network, use the gateway container name instead:
```shell
OPENAI_API_BASE=http://keeptrusts-gateway:41002/v1
```
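For reference, a docker-compose fragment wiring both services onto one network might look like this (the service names, network name, and the gateway service definition itself are assumptions for illustration):

```yaml
# Illustrative only: exact service names and the gateway's own compose
# definition depend on your deployment.
services:
  keeptrusts-gateway:
    # ... your gateway service definition ...
    networks:
      - dify-net
  api:  # the Dify API service
    environment:
      OPENAI_API_BASE: http://keeptrusts-gateway:41002/v1
      OPENAI_API_KEY: ${OPENAI_API_KEY}
    networks:
      - dify-net

networks:
  dify-net: {}
```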
Use the hosted gateway URL for production:
| Field | Value |
|---|---|
| API Endpoint URL | https://gateway.keeptrusts.com/v1 |
Using with Dify workflows
Once the model provider is configured, all Dify features that use that provider route through the gateway automatically:
- Chatbot applications — every conversation turn is governed.
- Workflow nodes — LLM nodes, knowledge retrieval nodes, and tool nodes that call LLMs route through the gateway.
- Agent applications — agent reasoning and tool-calling interactions are governed.
No changes to individual workflows are required.
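You can also confirm routing independently of Dify by aiming the same OpenAI-style request at the gateway yourself. The sketch below only builds the request (assuming the gateway's `/v1` endpoint is OpenAI-compatible, as the provider configuration above implies); uncomment the last line to actually send it against a running gateway.

```python
import json
import urllib.request

GATEWAY_URL = "http://localhost:41002/v1"  # local gateway from the setup above

def build_chat_request(model: str, content: str) -> urllib.request.Request:
    """Build an OpenAI-style chat completion request aimed at the gateway."""
    payload = {"model": model, "messages": [{"role": "user", "content": content}]}
    return urllib.request.Request(
        f"{GATEWAY_URL}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": "Bearer your-openai-api-key",  # upstream provider key
            "Content-Type": "application/json",
        },
    )

req = build_chat_request("gpt-4o", "Hello through the gateway")
print(req.full_url)  # http://localhost:41002/v1/chat/completions
# urllib.request.urlopen(req)  # uncomment with the gateway running
```

If the request succeeds here but fails from Dify, the problem is container networking, not the gateway (see Troubleshooting below).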
Setup steps
1. Start the Keeptrusts gateway with your policy config:

   ```shell
   kt gateway run --policy-config policy-config.yaml
   ```

2. Open Dify admin and navigate to Settings → Model Providers.
3. Add a new OpenAI-API-compatible provider with the gateway URL as the API endpoint.
4. Select the gateway-routed model in your Dify applications.
5. Test a conversation — send a message in a Dify chatbot and verify the request flows through the gateway.
6. Verify in the Keeptrusts console — open Events to confirm requests appear with policy outcomes.
Verification
Check gateway health:
```shell
curl http://localhost:41002/keeptrusts/health
```
Send a test message in a Dify application and confirm:
- Gateway logs show policy chain evaluation for the request.
- The Keeptrusts console Events page shows the request with model, tokens, cost, and policy decisions.
- Policy actions (allowed, blocked, redacted) are visible in the event detail.
- Dify receives the response normally — the gateway is transparent to the application.
Recommended policies
| Policy | Purpose | Phase |
|---|---|---|
| `prompt-injection` | Block jailbreak attempts from chat users or workflow inputs | Input |
| `pii-detector` | Redact PII before prompts reach the provider | Input |
| `safety-filter` | Block harmful or inappropriate content | Input |
| `dlp-filter` | Prevent sensitive data from leaving via LLM calls | Input |
| `quality-scorer` | Score and threshold response quality | Output |
| `human-oversight` | Escalate sensitive outputs for human review | Output |
| `audit-logger` | Attach audit metadata for every Dify interaction | Input |
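Enabled together, these policies slot into the `chain` from the earlier `policy-config.yaml`. The ordering below is only a suggestion (input-phase policies first, output-phase last; how the gateway actually assigns phases is not shown here):

```yaml
# Suggested chain order only; adjust to your requirements.
policies:
  chain:
    - prompt-injection   # input
    - pii-detector       # input
    - safety-filter      # input
    - dlp-filter         # input
    - audit-logger       # input
    - quality-scorer     # output
    - human-oversight    # output
```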
Troubleshooting
| Symptom | Cause | Fix |
|---|---|---|
| Dify shows "Model provider connection failed" | Gateway is not reachable from the Dify container | Use `host.docker.internal` or the Docker network hostname; verify the gateway is running |
| `401 Unauthorized` in Dify model test | API key mismatch | Verify the API key in Dify matches `secret_key_ref.env` in the gateway config |
| Workflows time out | Gateway adds latency to LLM calls | Increase Dify's request timeout setting and optimize the policy chain |
| Events not appearing in Keeptrusts console | Gateway not connected to control plane | Set `KEEPTRUSTS_API_URL` and `KEEPTRUSTS_GATEWAY_TOKEN` before starting the gateway |
| Streaming responses not working | Dify or gateway streaming mismatch | Ensure the gateway supports streaming for the configured provider and Dify has streaming enabled |
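For the control-plane row above, a typical fix is exporting the variables before launching the gateway. The variable names come from the table; both values below are placeholders:

```shell
# Values are placeholders; use your actual control-plane URL and gateway token.
export KEEPTRUSTS_API_URL="https://api.keeptrusts.example"
export KEEPTRUSTS_GATEWAY_TOKEN="kt_gw_..."
# then start the gateway as before:
# kt gateway run --policy-config policy-config.yaml
```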
For AI systems
- Canonical integration: Configure Dify's model provider with the API endpoint set to `http://localhost:41002/v1` or `https://gateway.keeptrusts.com/v1`.
- The gateway is transparent — Dify chatbots, workflows, agents, and knowledge retrieval work unchanged.
- See the Policy Controls Catalog for available policies.
For engineers
- The only change is the model provider endpoint URL in Dify's settings. All workflows, chatbots, and agent configurations remain unchanged.
- For self-hosted Dify, configure via environment variables to apply the gateway URL to all workspaces.
- Test with a simple chatbot first, then extend to complex workflows.
For leaders
- Dify is often used by teams with mixed technical skill levels. Keeptrusts provides centralized governance without requiring workflow-level changes.
- Audit logging at the gateway provides visibility into all LLM interactions across all Dify workspaces.
- Cost attribution tracks spend per application, enabling chargeback and budget management.
Next steps
- Quickstart — set up your first gateway and policy config.
- Policy Controls Catalog — full inventory of available policies.
- Events and Traces — understand the audit trail.
- Gateway Runtime Features — advanced gateway capabilities.
- Cost and Spend — monitor and attribute LLM costs.