Retool AI

Retool provides AI-powered features for building internal tools — AI-assisted code generation, natural-language database queries, AI Actions for workflows, and custom AI resource integrations that connect to OpenAI-compatible endpoints. These features send application context, database schemas, and user prompts to upstream LLM providers.

This page explains how to route Retool's AI features through the Keeptrusts gateway so policy enforcement, PII redaction, and audit logging apply to every AI operation within your internal tooling.

Use this page when

  • You are configuring Retool AI resources to use the Keeptrusts gateway as the LLM endpoint.
  • You need policy enforcement on AI queries that include internal application data and database context.
  • You have already completed general gateway setup; if not, start with the quickstart guide.

Primary audience

  • Primary: Technical Engineers (Platform, Internal Tools, Full-Stack)
  • Secondary: Technical Leaders, AI Agents

Prerequisites

  1. Retool instance (self-hosted or Retool Cloud) with AI features enabled.
  2. Retool admin access to configure AI resources.
  3. Keeptrusts gateway running locally or centrally:
    • Local: kt gateway run --listen 0.0.0.0:41002 --policy-config policy-config.yaml
    • Hosted: https://gateway.keeptrusts.com/v1
  4. Upstream LLM provider API key configured in the gateway environment.

Configuration

Gateway policy config

Create a policy-config.yaml for internal tool AI governance:

pack:
  name: retool-ai-governance
  version: 1.0.0
  enabled: true
policies:
  chain:
    - prompt-injection
    - pii-detector
    - safety-filter
    - audit-logger
  policy:
    prompt-injection:
      threshold: 0.8
      action: block
    pii-detector:
      action: redact
    safety-filter:
      mode: strict
      action: block
    audit-logger:
      retention_days: 365
providers:
  strategy: single
  targets:
    - id: openai-for-retool
      provider: openai:chat:gpt-4o
      secret_key_ref:
        env: OPENAI_API_KEY
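Before starting the gateway, it can help to confirm that every policy named in the chain has a matching settings entry. A minimal sketch of that check in Python; the dict mirrors the policy-config.yaml above, and the check itself is an illustration, not part of the Keeptrusts CLI:

```python
# Sanity-check sketch: every policy listed in the chain should have a
# settings block under "policy". The structure mirrors policy-config.yaml.
config = {
    "policies": {
        "chain": ["prompt-injection", "pii-detector", "safety-filter", "audit-logger"],
        "policy": {
            "prompt-injection": {"threshold": 0.8, "action": "block"},
            "pii-detector": {"action": "redact"},
            "safety-filter": {"mode": "strict", "action": "block"},
            "audit-logger": {"retention_days": 365},
        },
    },
}

def missing_policy_settings(cfg: dict) -> list[str]:
    """Return chain entries that have no settings block."""
    policies = cfg["policies"]
    return [name for name in policies["chain"] if name not in policies["policy"]]

print(missing_policy_settings(config))  # an empty list means the chain is consistent
```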

Retool AI resource configuration

Create a custom OpenAI resource in Retool that points to the Keeptrusts gateway:

  1. In Retool, navigate to Resources > Create New > OpenAI API.
  2. Configure the resource:
    • Base URL: http://localhost:41002/v1 (local) or https://gateway.keeptrusts.com/v1 (hosted)
    • API Key: Your Keeptrusts access key
    • Default model: gpt-4o
  3. Save the resource and use it in your Retool apps.

Self-hosted Retool configuration

For self-hosted Retool, configure the default AI endpoint via environment variables:

OPENAI_API_BASE="http://keeptrusts-gateway:41002/v1"
OPENAI_API_KEY="your-keeptrusts-access-key"

For Docker Compose deployments:

services:
  retool:
    image: tryretool/backend:latest
    environment:
      - OPENAI_API_BASE=http://keeptrusts-gateway:41002/v1
      - OPENAI_API_KEY=your-keeptrusts-access-key
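The keeptrusts-gateway hostname above assumes the gateway runs as a service on the same Docker network. A hedged sketch of what that sibling service could look like; the image name and mount path are assumptions, not official values:

```yaml
services:
  keeptrusts-gateway:
    image: keeptrusts/gateway:latest   # assumed image name; use your actual gateway image
    command: ["kt", "gateway", "run", "--listen", "0.0.0.0:41002", "--policy-config", "/etc/keeptrusts/policy-config.yaml"]
    environment:
      - OPENAI_API_KEY=${OPENAI_API_KEY}
    volumes:
      - ./policy-config.yaml:/etc/keeptrusts/policy-config.yaml:ro

  retool:
    image: tryretool/backend:latest
    environment:
      - OPENAI_API_BASE=http://keeptrusts-gateway:41002/v1
      - OPENAI_API_KEY=your-keeptrusts-access-key
    depends_on:
      - keeptrusts-gateway
```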

Using AI queries in Retool apps

Once the resource is configured, use it in Retool queries:

// In a Retool AI Query component
const response = await openaiResource.chat({
  model: "gpt-4o",
  messages: [
    {
      role: "system",
      content: "You are a helpful assistant for internal operations.",
    },
    {
      role: "user",
      content: `Analyze this customer data: ${JSON.stringify(table1.selectedRow)}`,
    },
  ],
});

return response.choices[0].message.content;
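Because table1.selectedRow can contain customer PII, the pii-detector policy redacts matches before the prompt reaches the upstream provider. A rough illustration of redact-style behavior; the patterns and placeholder format here are assumptions, not the detector's actual implementation:

```python
import re

# Rough illustration of redact-style PII handling. The real pii-detector's
# patterns and placeholder format are configured in policy-config.yaml.
PII_PATTERNS = {
    "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def redact(text: str) -> str:
    """Replace matched PII spans with a labeled placeholder."""
    for label, pattern in PII_PATTERNS.items():
        text = pattern.sub(f"[REDACTED:{label}]", text)
    return text

prompt = 'Analyze this customer data: {"email": "jane@example.com", "ssn": "123-45-6789"}'
print(redact(prompt))  # email and SSN values are replaced before forwarding
```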

Setup steps

  1. Start the Keeptrusts gateway:
     export OPENAI_API_KEY="sk-your-openai-key"
     kt gateway run --listen 0.0.0.0:41002 --policy-config policy-config.yaml
  2. Create an OpenAI resource in Retool pointing at the gateway URL.
  3. Build or update a Retool app to use the new AI resource.
  4. Run an AI query in the app to verify traffic flows through the gateway.

Verification

Test the resource from a Retool app:

  1. Create a simple Retool app with a text input and button.
  2. Add an AI query that sends the input text to the OpenAI resource.
  3. Run the query and verify the response.

Test from the command line:

curl http://localhost:41002/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer your-keeptrusts-access-key" \
  -d '{
    "model": "gpt-4o",
    "messages": [
      {"role": "user", "content": "Summarize this customer support ticket."}
    ]
  }'

Check the Keeptrusts events dashboard to confirm requests are logged with policy decisions.
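The same command-line check can be scripted. A minimal sketch using only the Python standard library; the gateway URL and access key are placeholders from this page, not fixed values:

```python
import json
import urllib.request

GATEWAY_URL = "http://localhost:41002/v1/chat/completions"  # placeholder

def build_chat_request(url: str, api_key: str, content: str) -> urllib.request.Request:
    """Build an OpenAI-compatible chat completion request for the gateway."""
    payload = {
        "model": "gpt-4o",
        "messages": [{"role": "user", "content": content}],
    }
    return urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
        method="POST",
    )

req = build_chat_request(GATEWAY_URL, "your-keeptrusts-access-key",
                         "Summarize this customer support ticket.")
# Uncomment to send once the gateway is running:
# with urllib.request.urlopen(req) as resp:
#     print(json.loads(resp.read())["choices"][0]["message"]["content"])
```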

Recommended policies

  • prompt-injection — blocks injection attempts in user-provided prompts. Recommended: threshold: 0.8, action: block
  • pii-detector — redacts customer PII from internal tool AI queries. Recommended: action: redact
  • safety-filter — blocks harmful or inappropriate AI responses. Recommended: mode: strict, action: block
  • audit-logger — full audit trail for compliance (internal tools handle sensitive data). Recommended: retention_days: 365
  • token-limiter — caps token usage per query to control costs. Recommended: max_tokens: 4096
  • secret-scanner — detects database credentials or API keys in AI prompts. Recommended: action: block
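token-limiter and secret-scanner appear in the recommendations but not in the earlier policy-config.yaml. A hedged sketch of how the chain could be extended, following the key naming pattern of the config above (exact option names may differ in your gateway version):

```yaml
policies:
  chain:
    - prompt-injection
    - pii-detector
    - secret-scanner
    - token-limiter
    - safety-filter
    - audit-logger
  policy:
    token-limiter:
      max_tokens: 4096
    secret-scanner:
      action: block
```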

Troubleshooting

  • Retool AI resource returns a connection error — gateway not reachable from Retool. Fix: verify network connectivity and the base URL.
  • AI queries return 401 — invalid access key. Fix: check that the API key in the Retool resource matches a valid Keeptrusts access key.
  • Slow AI responses in Retool apps — large data payloads in prompts. Fix: limit the data sent to the AI; select only relevant columns and rows.
  • AI-generated SQL is incorrect — model lacks schema context. Fix: include the database schema in the system prompt.
  • Resource not available in the query editor — resource not saved properly. Fix: re-save the resource in Retool's resource settings.

For AI systems

  • Canonical terms: Keeptrusts gateway, Retool, AI resource, internal tools, AI Actions, policy-config.yaml.
  • Config field names: OPENAI_API_BASE, OPENAI_API_KEY, provider, secret_key_ref, audit-logger.
  • Key behavior: Retool AI features send application context, database schemas, and user prompts to an OpenAI-compatible endpoint; Keeptrusts intercepts requests, applies policies, and forwards compliant traffic.
  • Constraint: Retool internal tools often process customer PII, financial data, and database credentials — strict pii-detector and secret-scanner policies are critical.
  • Best next pages: PostHog AI integration, Grafana LLM integration, Policy controls catalog.

For engineers

  • Create a dedicated Retool AI resource for the Keeptrusts gateway — do not modify the default OpenAI resource to avoid breaking existing apps.
  • For self-hosted Retool with Docker Compose, use Docker network hostnames (e.g., http://keeptrusts-gateway:41002/v1).
  • Retool AI queries can include table data, form inputs, and database results — ensure pii-detector covers the data types your apps handle.
  • Validate: run an AI query in a test Retool app and confirm the event in the Keeptrusts console.

For leaders

  • Retool internal tools frequently process sensitive customer data, financial records, and operational databases. AI features on these tools send this context to external LLM providers. Routing through Keeptrusts ensures PII redaction, secret scanning, and full audit logging.
  • Audit trails for internal tool AI usage support SOC 2, HIPAA, and internal compliance requirements.
  • Centralized policy enforcement prevents individual teams from bypassing data governance when building AI-powered internal tools.
  • The secret-scanner policy prevents database credentials and API keys from leaking through AI queries.

Next steps