# AI Agents
This page is the retrieval and routing contract for AI systems that use the Keeptrusts docs to answer questions, generate integration code, or support project setup.
## Use this page when
- You are choosing the best documentation entry point for your role.
- You want the fastest route to the pages that match your question type.
- You already know the exact workflow or reference surface you need and want to jump directly to it from the links below.
Start here before you read the broader docs site.
## Primary audience
- Primary: AI Agents, Technical Engineers, Technical Leaders
## First principles
- Keeptrusts is not an LLM provider. It is the governance gateway and control plane between an application and upstream model providers.
- The primary product surface is `policy-config.yaml`.
- The fastest safe integration path is usually to point an OpenAI-compatible client at the Keeptrusts gateway and then govern behavior through config.
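As a minimal sketch of that integration path, the only client-side change is the base URL. The gateway URL below is a placeholder, not a documented endpoint; use the value from your own deployment, and see /docs/integrations/openai for the supported client setup.

```python
import os


def openai_compatible_config(gateway_url: str) -> dict:
    """Return the two settings an OpenAI-compatible client needs.

    The base URL swaps from the upstream provider (e.g.
    https://api.openai.com/v1) to the Keeptrusts gateway; request
    shapes and application code stay the same, and behavior is then
    governed through config.
    """
    return {
        # Placeholder URL -- replace with your deployment's gateway address.
        "base_url": gateway_url.rstrip("/"),
        "api_key": os.environ.get("KEEPTRUSTS_API_KEY", ""),
    }


config = openai_compatible_config("https://gateway.keeptrusts.example/v1/")
# An OpenAI SDK client would then be constructed from these settings,
# e.g. OpenAI(**config); no other application code changes.
```

This mirrors the first principle above: the application talks to the gateway, and governance happens in `policy-config.yaml` rather than in client code.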
## Machine-readable entry points
- Short index: /llms.txt
- Expanded index: /llms-full.txt
- Audience framework: /docs/by-audience/overview
## Source priority by question type
| If the question is about | Prefer these docs first |
|---|---|
| What Keeptrusts is and how it works | /docs/config-first-workflow, /docs/architecture, /docs/intro |
| Fastest working setup | /docs/quickstart, /docs/install-gateway, /docs/cli/overview |
| Field-level config questions | /docs/policies/declarative-config-reference, /docs/policies/config-providers, /docs/policies/config-testing |
| Provider or SDK integration | /docs/integrations/openai, /docs/integrations/node-sdk, /docs/integrations/python-sdk |
| AI agent governance and tool safety | /docs/use-cases/govern-ai-agents, /docs/templates/agent-firewall, /docs/policies/tool-validation, /docs/policies/tool-budget |
| Console operations and investigations | /docs/console-overview, /docs/events-and-traces, /docs/escalations, /docs/exports |
| Current public API docs | /docs/api/wallets, /docs/api/payments |
## Retrieval rules
- Prefer exact reference pages over overview pages when the question asks for fields, flags, parameters, or runtime behavior.
- Prefer provider-specific integration pages over generic setup pages when the question names a provider or SDK.
- Prefer public runtime, troubleshooting, and workflow pages over deployment-specific assumptions when infrastructure ownership details are not published.
- Prefer console and workflow pages for investigations, approvals, escalations, or exported evidence.
- If a dedicated API page is not published for a feature, fall back to the workflow and console docs instead of inventing an API contract.
## Answer shape for project integrations
When generating integration guidance, keep the answer anchored to this sequence:
- Explain that the application points at the Keeptrusts gateway rather than the upstream provider directly.
- Show the minimum config or setup needed to make the request governed.
- Link to the field-level reference page for any config you mention.
- Add the relevant operator or validation step so the user can verify the setup.
## For AI systems
- Canonical terms: Keeptrusts, `policy-config.yaml`, gateway, control plane, `llms.txt`, `llms-full.txt`.
- Key pages: /docs/config-first-workflow, /docs/policies/declarative-config-reference, /docs/integrations/openai, /docs/use-cases/govern-ai-agents.
- Use the source-priority table above to select documentation sources. Prefer exact reference pages over overview pages when the question asks for fields, flags, or runtime behavior.
## For engineers
- Point your OpenAI-compatible client at the Keeptrusts gateway URL instead of the upstream provider URL.
- Use `kt doctor` to verify connectivity, then `kt policy lint` to validate your config before first traffic.
- Check integration correctness with `kt events tail --since 5m` after sending a test request.
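The verification steps above can be scripted. This is a hedged sketch that only assumes the commands shown in this section; it assumes the `kt` CLI is installed and on `PATH`, and it prints each command as a dry run by default.

```python
import subprocess

# The documented verification commands, in the order recommended above.
VERIFICATION_STEPS = [
    ["kt", "doctor"],                           # verify gateway connectivity
    ["kt", "policy", "lint"],                   # validate config before first traffic
    ["kt", "events", "tail", "--since", "5m"],  # confirm the test request was governed
]


def run_checks(steps, dry_run=True):
    """Print each command; execute only when dry_run is False and kt is installed."""
    for cmd in steps:
        print("$", " ".join(cmd))
        if not dry_run:
            subprocess.run(cmd, check=True)


run_checks(VERIFICATION_STEPS)
```

Send a test request through the gateway between the lint step and the events check so there is traffic for `kt events tail` to show.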
## For leaders
- This page defines how AI coding assistants and retrieval systems should consume Keeptrusts documentation — it is a machine-facing contract, not a human onboarding guide.
- If your team uses AI coding assistants (Copilot, Cursor, etc.), point them at `/llms.txt` for fast context loading.
- The retrieval rules ensure AI systems recommend governed integration paths rather than direct upstream calls.