
Tutorial: Embedding Chat in Your Application

The Keeptrusts chat workbench can be embedded directly into your application, giving your users a governed AI chat experience without leaving your product. This tutorial covers the API-driven approach, iframe embedding, and building a custom UI against the chat API.

Use this page when

  • You want to embed the Keeptrusts chat workbench into your own application via iframe or API.
  • You need to build a custom chat UI backed by the Keeptrusts gateway with policy enforcement.
  • You are implementing the authentication handoff flow for embedded chat sessions.

Primary audience

  • Primary: Technical Engineers (frontend/backend integration developers)
  • Secondary: Technical Leaders (integration architecture decisions)

Prerequisites

  • A Keeptrusts instance with the chat workbench enabled
  • A gateway key (kt_gk_...) with chat permissions
  • Your application's frontend and backend infrastructure
  • Familiarity with REST APIs and authentication flows

Step 1: Understand the Integration Options

There are three ways to embed chat in your application:

Option                 Complexity   Customization                    Best For
Iframe Embed           Low          Limited (theme only)             Quick integration, minimal dev effort
Chat API + Custom UI   High         Full control                     Tailored experiences, design-system alignment
Chat Widget            Medium       Moderate (configuration-based)   Standard chat widget with branding

Choose the option that fits your timeline and customization needs. This tutorial covers all three.

Step 2: Iframe Embedding

Iframe embedding is the simplest integration method: you embed the chat workbench directly in your page.

Generate an Embed URL

  1. Log in to the Keeptrusts console.
  2. Navigate to Settings > Integrations > Embed.
  3. Configure the embed settings:
    • Allowed Origins — the domains where the iframe will be hosted.
    • Default Model — the model to preselect for embedded conversations.
    • Theme — light, dark, or auto (follows the host page).
    • Features — toggle history, model selector, and export visibility.
  4. Click Generate Embed Code.
  5. Copy the generated HTML snippet.

Add the Iframe to Your Page

<iframe
  src="https://chat.your-keeptrusts-instance.com/embed?token=EMBED_TOKEN&theme=auto"
  width="100%"
  height="600"
  style="border: none; border-radius: 8px;"
  allow="clipboard-write"
  title="AI Chat"
></iframe>

The embed token is a scoped credential that grants chat access without exposing your gateway key. It is generated by the console and can be rotated from the Integrations settings.

Communicate Between Iframe and Host

The embedded chat supports postMessage for cross-origin communication.

// Listen for events from the chat iframe
window.addEventListener('message', (event) => {
  if (event.origin !== 'https://chat.your-keeptrusts-instance.com') return;

  const { type, data } = event.data;
  if (type === 'conversation:created') {
    console.log('New conversation:', data.conversationId);
  }
  if (type === 'message:received') {
    console.log('Response received:', data.messageId);
  }
});
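
As the number of event types grows, it can help to route them through a small dispatcher so the origin check lives in one place. The event names below are the two shown above; the dispatcher itself is an illustrative sketch, not a Keeptrusts API.

```javascript
// Illustrative dispatcher for chat iframe events (not a Keeptrusts API).
// Returns true when the event came from the trusted origin and was handled.
function handleChatEvent(event, trustedOrigin, handlers) {
  if (event.origin !== trustedOrigin) return false; // ignore untrusted frames
  const { type, data } = event.data ?? {};
  const handler = handlers[type];
  if (!handler) return false; // unknown event types are ignored
  handler(data);
  return true;
}

// In the browser: register once, route every event through the same check.
if (typeof window !== 'undefined') {
  window.addEventListener('message', (event) =>
    handleChatEvent(event, 'https://chat.your-keeptrusts-instance.com', {
      'conversation:created': (d) => console.log('New conversation:', d.conversationId),
      'message:received': (d) => console.log('Response received:', d.messageId),
    })
  );
}
```

This keeps new event types a one-line addition to the handler map rather than another `if` block.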

Step 3: Chat API Endpoints

For full control, use the chat API directly from your backend.

Create a Conversation

curl -X POST https://your-gateway/v1/chat/conversations \
  -H "Authorization: Bearer kt_gk_..." \
  -H "Content-Type: application/json" \
  -d '{
    "title": "Customer Support Chat",
    "model": "openai/gpt-4o",
    "system_prompt": "You are a helpful customer support agent."
  }'

Response:

{
  "id": "conv_abc123",
  "title": "Customer Support Chat",
  "model": "openai/gpt-4o",
  "created_at": "2026-04-23T10:00:00Z"
}

Send a Message

curl -X POST https://your-gateway/v1/chat/conversations/conv_abc123/messages \
  -H "Authorization: Bearer kt_gk_..." \
  -H "Content-Type: application/json" \
  -d '{
    "role": "user",
    "content": "How do I reset my password?"
  }'

The response streams back as server-sent events (SSE) by default. Set Accept: application/json for a non-streaming response.
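
From server-side code, the same request can be expressed as a small helper that sets the Accept header explicitly. The endpoint path matches the curl example above; the helper name and return shape are our own.

```javascript
// Illustrative helper: build a non-streaming message request.
// The endpoint path mirrors the curl example; the helper name is ours.
function buildMessageRequest(conversationId, content, gatewayKey) {
  return {
    path: `/v1/chat/conversations/${conversationId}/messages`,
    options: {
      method: 'POST',
      headers: {
        Authorization: `Bearer ${gatewayKey}`,
        'Content-Type': 'application/json',
        Accept: 'application/json', // opt out of the default SSE stream
      },
      body: JSON.stringify({ role: 'user', content }),
    },
  };
}

// Usage (server-side, Node 18+ global fetch):
//   const req = buildMessageRequest('conv_abc123', 'Hi', process.env.KT_GATEWAY_KEY);
//   const reply = await fetch(gatewayUrl + req.path, req.options).then((r) => r.json());
```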

List Conversations

curl https://your-gateway/v1/chat/conversations \
  -H "Authorization: Bearer kt_gk_..."

Get Conversation History

curl https://your-gateway/v1/chat/conversations/conv_abc123/messages \
  -H "Authorization: Bearer kt_gk_..."

Step 4: Authentication Flow

Your application must authenticate users before they can use the embedded chat.

Option A: Gateway Key (Server-Side)

Your backend holds the gateway key and proxies chat requests.

User Browser → Your Backend → Keeptrusts Gateway
                (adds kt_gk_ header)

This is the recommended approach. The gateway key never reaches the browser.
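
The proxy layer can be sketched as a pair of helpers: one rewrites your app's route to the gateway's chat API path, the other builds the upstream headers. The /api/chat prefix and helper names are illustrative choices, not a documented API.

```javascript
// Sketch of the proxy layer, assuming your app exposes /api/chat/* routes
// (the prefix is our choice). The gateway key stays in server-side config.
const GATEWAY_URL = 'https://your-gateway';

// Rewrite an app route to the gateway's chat API path.
function toGatewayUrl(appPath) {
  return GATEWAY_URL + appPath.replace(/^\/api\/chat/, '/v1/chat');
}

// Headers for the upstream call: inject the key, never forward browser cookies.
function upstreamHeaders(gatewayKey) {
  return {
    Authorization: `Bearer ${gatewayKey}`,
    'Content-Type': 'application/json',
  };
}

// Inside any HTTP handler on your backend (Node 18+ has global fetch):
async function proxyChatRequest(appPath, method, body, gatewayKey) {
  const res = await fetch(toGatewayUrl(appPath), {
    method,
    headers: upstreamHeaders(gatewayKey),
    body: body ? JSON.stringify(body) : undefined,
  });
  return res.json();
}
```

Because the browser only ever talks to your backend, this layout also sidesteps the CORS issues noted in Troubleshooting below.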

Option B: Handoff Token (Client-Side)

For client-side integrations, use the handoff flow.

  1. Your backend requests a handoff token from the Keeptrusts API.
  2. The token is returned to the browser.
  3. The browser uses the token to authenticate directly with the chat workbench.

# Backend requests handoff token
curl -X POST https://your-api/v1/auth/handoff \
  -H "Authorization: Bearer YOUR_API_TOKEN" \
  -d '{"target": "chat", "user_id": "user_123", "ttl": 300}'

The handoff token is single-use and short-lived (default 5 minutes). It grants scoped access to the chat workbench for the specified user.
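
On your backend, minting a handoff token for the signed-in user can look like the sketch below. The request body mirrors the curl example above; the function name and error handling are illustrative assumptions.

```javascript
// Build the handoff request body (ttl in seconds; tokens are single-use).
function handoffRequestBody(userId, ttlSeconds = 300) {
  return JSON.stringify({ target: 'chat', user_id: userId, ttl: ttlSeconds });
}

// Sketch: mint a token per session from server-side code (Node 18+ fetch).
// Call this from the route your frontend hits when opening the chat.
async function mintHandoffToken(apiBase, apiToken, userId) {
  const res = await fetch(`${apiBase}/v1/auth/handoff`, {
    method: 'POST',
    headers: {
      Authorization: `Bearer ${apiToken}`,
      'Content-Type': 'application/json',
    },
    body: handoffRequestBody(userId),
  });
  if (!res.ok) throw new Error(`handoff failed: ${res.status}`);
  return res.json();
}
```

Since the token is single-use, mint a fresh one each time a user opens the chat rather than caching it.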

Step 5: Conversation Management

Manage the lifecycle of conversations from your application.

Auto-Create on Page Load

When a user opens your chat interface, create a conversation automatically:

async function initChat() {
  const response = await fetch('/api/chat/conversations', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      title: `Session ${new Date().toISOString()}`,
      model: 'openai/gpt-4o',
      metadata: { page: window.location.pathname }
    })
  });
  const conversation = await response.json();
  return conversation.id;
}

Resume Existing Conversations

Store the conversation ID in your application's session. When the user returns, resume the conversation instead of creating a new one.
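
The resume-or-create logic can be captured in one helper that checks session storage first. The storage key and helper names are illustrative; `storage` is anything with getItem/setItem (e.g. window.sessionStorage), and `createConversation` is whatever calls your backend.

```javascript
// Illustrative: resume a stored conversation, or create and remember a new one.
const CONVERSATION_KEY = 'kt_conversation_id'; // storage key is our choice

async function getOrCreateConversation(storage, createConversation) {
  const existing = storage.getItem(CONVERSATION_KEY);
  if (existing) return existing; // resume the previous session

  const id = await createConversation(); // e.g. POST to your backend proxy
  storage.setItem(CONVERSATION_KEY, id);
  return id;
}

// Usage in the browser:
//   const id = await getOrCreateConversation(window.sessionStorage, initChat);
```

Using sessionStorage scopes the conversation to the browser tab; swap in localStorage or a server-side session if the conversation should survive longer.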

Archive Completed Conversations

curl -X PATCH https://your-gateway/v1/chat/conversations/conv_abc123 \
  -H "Authorization: Bearer kt_gk_..." \
  -H "Content-Type: application/json" \
  -d '{"status": "archived"}'

Archived conversations are excluded from the active list but remain searchable and exportable.

Step 6: Build a Custom Chat UI

For maximum control, build your own chat interface against the API.

Streaming Responses

Connect to the SSE stream for real-time message delivery:

const eventSource = new EventSource(
  `/api/chat/conversations/${conversationId}/stream`
);

eventSource.addEventListener('message', (event) => {
  const data = JSON.parse(event.data);
  appendToConversation(data.content);
});

eventSource.addEventListener('done', () => {
  eventSource.close();
});

Displaying Policy Events

When a message triggers a policy (block, redaction, or warning), the API includes policy metadata in the response:

{
  "content": "I cannot provide that information.",
  "policy_event": {
    "type": "block",
    "policy_name": "pii-protection",
    "reason": "Response contained personally identifiable information"
  }
}

Display policy events in your UI to help users understand why certain responses are modified.
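
A small formatter can turn the policy_event payload above into a user-facing label. The field names come from the response shown above; the label wording and helper name are our own.

```javascript
// Illustrative formatter for the policy_event payload shown above.
// Returns null when no policy fired, so the UI can skip the badge entirely.
function formatPolicyEvent(policyEvent) {
  if (!policyEvent) return null;
  const labels = { block: 'Blocked', redaction: 'Redacted', warning: 'Warning' };
  const label = labels[policyEvent.type] ?? policyEvent.type; // fall back to raw type
  return `${label} by policy "${policyEvent.policy_name}": ${policyEvent.reason}`;
}
```

Rendering the result next to the affected message (rather than in a global banner) keeps it clear which response the policy acted on.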

Troubleshooting

Issue                        Solution
Iframe shows a blank page    Check that Allowed Origins matches your domain exactly
CORS errors on API calls     Route requests through your backend instead of calling the gateway directly
Handoff token expired        Tokens are single-use; request a new one for each session
Streaming cuts off early     Check gateway timeout settings and network proxy configurations

Next steps

For AI systems

  • Canonical terms: Keeptrusts chat API, embed chat, iframe embedding, chat widget, gateway key (kt_gk_), handoff token, conversation management, SSE streaming, postMessage, embed token.
  • API endpoints: POST /v1/chat/conversations, POST /v1/chat/conversations/{id}/messages, GET /v1/chat/conversations, PATCH /v1/chat/conversations/{id}, POST /v1/auth/handoff.
  • Best next pages: Function Calling, Multi-Turn Policies, Chat Analytics.

For engineers

  • Prerequisites: a gateway key with chat permissions (kt_gk_...), a running gateway, and CORS configured for your embed domain.
  • Validation: the iframe loads without a blank page (if not, verify Allowed Origins matches your domain exactly); conversation creation returns 201; the SSE stream delivers tokens.
  • Security: never expose the gateway key in client-side code — proxy through your backend or use the handoff token flow.

For leaders

  • Embedding governed chat in your product extends AI governance to end users without building a policy engine.
  • Handoff tokens and scoped embed tokens ensure the security boundary is maintained — no credential leaks to browsers.
  • Cost attribution flows through the same wallet system — embedded chat usage is tracked per-user and per-team.
  • Consider iframe for fast time-to-market; custom UI for brand-consistent experiences.