Dify with Keeptrusts Gateway

Dify is an open-source low-code platform for building AI applications — chatbots, agents, RAG pipelines, and workflow automations — with a visual editor and model management layer. By configuring Dify to route LLM calls through the Keeptrusts gateway, every model interaction in your Dify applications passes through your policy chain for prompt-injection detection, PII redaction, audit logging, cost attribution, and content filtering, all without modifying your Dify workflows.

Use this page when

  • You are running a Dify instance and need governance over all LLM calls.
  • You want audit logging and cost attribution for Dify chatbots and workflow applications.
  • You need to enforce compliance controls across multiple Dify workspaces.
  • You are deploying Dify in a regulated environment and need centralized policy enforcement.

Primary audience

  • Primary: Technical Engineers
  • Secondary: AI Agents, Technical Leaders

Prerequisites

  • Keeptrusts CLI installed and a gateway running locally or centrally (Quickstart).
  • Dify instance running (self-hosted or Dify Cloud) with admin access.
  • Upstream provider API key (e.g. OpenAI, Anthropic) ready to configure.
  • A policy-config.yaml deployed to the gateway.

Configuration

Gateway policy config

A minimal config for governing Dify traffic:

pack:
  name: dify-gateway
  version: "1.0"

providers:
  - name: openai
    model: gpt-4o
    secret_key_ref:
      env: OPENAI_API_KEY

policies:
  chain:
    - prompt-injection
    - pii-detector
    - safety-filter
    - quality-scorer

policy:
  prompt-injection:
    action: block
  pii-detector:
    action: redact
  safety-filter:
    action: block
  quality-scorer:
    threshold: 0.6

Start the gateway:

kt gateway run --policy-config policy-config.yaml
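Before pointing Dify at the gateway, you can smoke-test the OpenAI-compatible endpoint directly. A minimal stdlib-only sketch, assuming the gateway speaks the standard OpenAI wire format at `/v1/chat/completions` on the port used throughout this page:

```python
import json
import urllib.request

# Local gateway endpoint from this guide (adjust host/port for your deployment).
GATEWAY_URL = "http://localhost:41002/v1/chat/completions"

def build_request(api_key: str, model: str, prompt: str) -> urllib.request.Request:
    """Build an OpenAI-compatible chat completion request aimed at the gateway."""
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    return urllib.request.Request(
        GATEWAY_URL,
        data=body,
        method="POST",
        headers={
            "Content-Type": "application/json",
            # Same upstream provider key you will later enter in Dify.
            "Authorization": f"Bearer {api_key}",
        },
    )

def smoke_test(api_key: str) -> str:
    """Send one governed request through the gateway and return the reply text.
    Requires the gateway started above to be running."""
    with urllib.request.urlopen(build_request(api_key, "gpt-4o", "ping")) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]
```

If this request succeeds and appears in the gateway logs with policy outcomes, Dify traffic will follow the same path.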

Dify model provider configuration

Dify supports custom model provider endpoints through its model management interface. Configure it to point at the Keeptrusts gateway instead of the upstream provider directly.

  1. Open the Dify admin panel and navigate to Settings → Model Providers.
  2. Select OpenAI-API-compatible as the provider type.
  3. Configure the following fields:
     Field              Value
     Model name         gpt-4o (or your target model)
     API Key            Your upstream provider API key
     API Endpoint URL   http://localhost:41002/v1
  4. Click Save to register the provider.

Using with Dify workflows

Once the model provider is configured, all Dify features that use that provider route through the gateway automatically:

  • Chatbot applications — every conversation turn is governed.
  • Workflow nodes — LLM nodes, knowledge retrieval nodes, and tool nodes that call LLMs route through the gateway.
  • Agent applications — agent reasoning and tool-calling interactions are governed.

No changes to individual workflows are required.

Setup steps

  1. Start the Keeptrusts gateway with your policy config.

    kt gateway run --policy-config policy-config.yaml
  2. Open Dify admin and navigate to Settings → Model Providers.

  3. Add a new OpenAI-API-compatible provider with the gateway URL as the API endpoint.

  4. Select the gateway-routed model in your Dify applications.

  5. Test a conversation — send a message in a Dify chatbot and verify the request flows through the gateway.

  6. Verify in the Keeptrusts console — open Events to confirm requests appear with policy outcomes.

Verification

Check gateway health:

curl http://localhost:41002/keeptrusts/health

Send a test message in a Dify application and confirm:

  • Gateway logs show policy chain evaluation for the request.
  • The Keeptrusts console Events page shows the request with model, tokens, cost, and policy decisions.
  • Policy actions (allowed, blocked, redacted) are visible in the event detail.
  • Dify receives the response normally — the gateway is transparent to the application.
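The health check above can also be scripted into a deploy pipeline. A minimal stdlib-only sketch; the `/keeptrusts/health` path matches the curl example, but the JSON shape of the response is an assumption, not documented here:

```python
import json
import urllib.request

GATEWAY = "http://localhost:41002"  # local gateway address from the examples above

def fetch_health(base_url: str = GATEWAY) -> dict:
    """GET the gateway health endpoint and decode its JSON body."""
    with urllib.request.urlopen(f"{base_url}/keeptrusts/health", timeout=5) as resp:
        return json.load(resp)

def is_healthy(payload: dict) -> bool:
    """Treat a response carrying a truthy 'status' field as healthy.
    The exact response schema is assumed for illustration."""
    return bool(payload.get("status"))

# Usage (requires a running gateway):
#   print(is_healthy(fetch_health()))
```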
Recommended policies for governing Dify traffic:

Policy            Purpose                                                       Phase
prompt-injection  Block jailbreak attempts from chat users or workflow inputs   Input
pii-detector      Redact PII before prompts reach the provider                  Input
safety-filter     Block harmful or inappropriate content                        Input
dlp-filter        Prevent sensitive data from leaving via LLM calls             Input
quality-scorer    Score and threshold response quality                          Output
human-oversight   Escalate sensitive outputs for human review                   Output
audit-logger      Attach audit metadata for every Dify interaction              Input
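Extending the minimal chain from the configuration section with the optional policies listed here might look like the sketch below; the policy names come from this page, but their ordering and any per-policy options should be checked against the Policy Controls Catalog:

```yaml
policies:
  chain:
    - prompt-injection   # input: block jailbreak attempts
    - pii-detector       # input: redact PII before the provider sees it
    - safety-filter      # input: block harmful or inappropriate content
    - dlp-filter         # input: prevent sensitive data exfiltration
    - audit-logger       # input: attach audit metadata per interaction
    - quality-scorer     # output: score and threshold response quality
    - human-oversight    # output: escalate sensitive outputs for review
```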

Troubleshooting

Symptom: Dify shows "Model provider connection failed"
Cause: Gateway is not reachable from the Dify container
Fix: Use host.docker.internal or the Docker network hostname; verify the gateway is running

Symptom: 401 Unauthorized in the Dify model test
Cause: API key mismatch
Fix: Verify the API key in Dify matches secret_key_ref.env in the gateway config

Symptom: Workflows time out
Cause: The gateway adds latency to LLM calls
Fix: Increase Dify's request timeout setting and optimize the policy chain

Symptom: Events not appearing in the Keeptrusts console
Cause: Gateway not connected to the control plane
Fix: Set KEEPTRUSTS_API_URL and KEEPTRUSTS_GATEWAY_TOKEN before starting the gateway

Symptom: Streaming responses not working
Cause: Dify or gateway streaming mismatch
Fix: Ensure the gateway supports streaming for the configured provider and that streaming is enabled in Dify

For AI systems

  • Canonical integration: Configure Dify's model provider with the API endpoint set to http://localhost:41002/v1 or https://gateway.keeptrusts.com/v1.
  • The gateway is transparent — Dify chatbots, workflows, agents, and knowledge retrieval work unchanged.
  • Use Policy Controls Catalog for available policies.

For engineers

  • The only change is the model provider endpoint URL in Dify's settings. All workflows, chatbots, and agent configurations remain unchanged.
  • For self-hosted Dify, configure via environment variables to apply the gateway URL to all workspaces.
  • Test with a simple chatbot first, then extend to complex workflows.

For leaders

  • Dify is often used by teams with mixed technical skill levels. Keeptrusts provides centralized governance without requiring workflow-level changes.
  • Audit logging at the gateway provides visibility into all LLM interactions across all Dify workspaces.
  • Cost attribution tracks spend per application, enabling chargeback and budget management.

Next steps