# Retool AI
Retool provides AI-powered features for building internal tools — AI-assisted code generation, natural-language database queries, AI Actions for workflows, and custom AI resource integrations that connect to OpenAI-compatible endpoints. These features send application context, database schemas, and user prompts to upstream LLM providers.
This page explains how to route Retool's AI features through the Keeptrusts gateway so policy enforcement, PII redaction, and audit logging apply to every AI operation within your internal tooling.
## Use this page when

- You are configuring Retool AI resources to use the Keeptrusts gateway as the LLM endpoint.
- You need policy enforcement on AI queries that include internal application data and database context.

For general gateway setup, start with the quickstart guide instead.
## Audience

- Primary: platform, internal tools, and full-stack engineers
- Secondary: technical leaders, AI agents
## Prerequisites

- Retool instance (self-hosted or Retool Cloud) with AI features enabled.
- Retool admin access to configure AI resources.
- Keeptrusts gateway running locally or hosted:
  - Local: `kt gateway run --listen 0.0.0.0:41002 --policy-config policy-config.yaml`
  - Hosted: `https://gateway.keeptrusts.com/v1`
- Upstream LLM provider API key configured in the gateway environment.
## Configuration

### Gateway policy config

Create a `policy-config.yaml` for internal tool AI governance:
```yaml
pack:
  name: retool-ai-governance
  version: 1.0.0
  enabled: true
policies:
  chain:
    - prompt-injection
    - pii-detector
    - safety-filter
    - audit-logger
  policy:
    prompt-injection:
      threshold: 0.8
      action: block
    pii-detector:
      action: redact
    safety-filter:
      mode: strict
      action: block
    audit-logger:
      retention_days: 365
providers:
  strategy: single
  targets:
    - id: openai-for-retool
      provider: openai:chat:gpt-4o
      secret_key_ref:
        env: OPENAI_API_KEY
```
### Retool AI resource configuration
Create a custom OpenAI resource in Retool that points to the Keeptrusts gateway:
1. In Retool, navigate to **Resources > Create New > OpenAI API**.
2. Configure the resource:

   | Setting | Value |
   |---|---|
   | Base URL | `http://localhost:41002/v1` (local) or `https://gateway.keeptrusts.com/v1` (hosted) |
   | API Key | Your Keeptrusts access key |
   | Default model | `gpt-4o` |

3. Save the resource and use it in your Retool apps.
### Self-hosted Retool configuration
For self-hosted Retool, configure the default AI endpoint via environment variables:
```shell
OPENAI_API_BASE="http://keeptrusts-gateway:41002/v1"
OPENAI_API_KEY="your-keeptrusts-access-key"
```
For Docker Compose deployments:
```yaml
services:
  retool:
    image: tryretool/backend:latest
    environment:
      - OPENAI_API_BASE=http://keeptrusts-gateway:41002/v1
      - OPENAI_API_KEY=your-keeptrusts-access-key
```
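For the `keeptrusts-gateway` hostname to resolve, the gateway must run on the same Compose network as Retool. A minimal sketch, assuming you package the `kt` binary in your own gateway image (the image name and mount paths below are illustrative assumptions, not documented values):

```yaml
services:
  keeptrusts-gateway:
    # Illustrative image name -- build or substitute an image containing `kt`.
    image: keeptrusts/gateway:latest
    command: ["kt", "gateway", "run", "--listen", "0.0.0.0:41002", "--policy-config", "/etc/keeptrusts/policy-config.yaml"]
    environment:
      # Upstream provider key, supplied from the host environment.
      - OPENAI_API_KEY=${OPENAI_API_KEY}
    volumes:
      - ./policy-config.yaml:/etc/keeptrusts/policy-config.yaml:ro

  retool:
    image: tryretool/backend:latest
    environment:
      - OPENAI_API_BASE=http://keeptrusts-gateway:41002/v1
      - OPENAI_API_KEY=your-keeptrusts-access-key
```

Compose places both services on the same default network, so Retool reaches the gateway by service name.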
### Using AI queries in Retool apps
Once the resource is configured, use it in Retool queries:
```javascript
// In a Retool AI Query component
const response = await openaiResource.chat({
  model: "gpt-4o",
  messages: [
    {
      role: "system",
      content: "You are a helpful assistant for internal operations.",
    },
    {
      role: "user",
      content: `Analyze this customer data: ${JSON.stringify(table1.selectedRow)}`,
    },
  ],
});
return response.choices[0].message.content;
```
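Interpolating `JSON.stringify(table1.selectedRow)` sends every column of the row, including fields the model does not need. A small allow-list helper keeps prompts lean and reduces what the `pii-detector` has to catch (a sketch; the helper name and column names are illustrative):

```javascript
// Keep only the columns the model actually needs before building the prompt.
// The column names here are illustrative -- adjust them to your table's schema.
function pickColumns(row, allowed) {
  const out = {};
  for (const key of allowed) {
    if (key in row) out[key] = row[key];
  }
  return out;
}

const row = { id: 17, name: "Acme Corp", plan: "enterprise", ssn: "000-00-0000", notes: "..." };
const minimal = pickColumns(row, ["id", "name", "plan"]);
// minimal is { id: 17, name: "Acme Corp", plan: "enterprise" } -- ssn and notes are never sent
```

In a Retool query, `minimal` would replace `table1.selectedRow` inside the user message.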
## Setup steps

1. Start the Keeptrusts gateway:

   ```shell
   export OPENAI_API_KEY="sk-your-openai-key"
   kt gateway run --listen 0.0.0.0:41002 --policy-config policy-config.yaml
   ```

2. Create an OpenAI resource in Retool pointing at the gateway URL.
3. Build or update a Retool app to use the new AI resource.
4. Run an AI query in the app to verify traffic flows through the gateway.
## Verification
Test the resource from a Retool app:

1. Create a simple Retool app with a text input and button.
2. Add an AI query that sends the input text to the OpenAI resource.
3. Run the query and verify the response.
Test from the command line:
```shell
curl http://localhost:41002/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer your-keeptrusts-access-key" \
  -d '{
    "model": "gpt-4o",
    "messages": [
      {"role": "user", "content": "Summarize this customer support ticket."}
    ]
  }'
```
Check the Keeptrusts events dashboard to confirm requests are logged with policy decisions.
## Recommended policies
| Policy | Purpose | Recommended setting |
|---|---|---|
| `prompt-injection` | Block injection attempts in user-provided prompts | `threshold: 0.8`, `action: block` |
| `pii-detector` | Redact customer PII from internal tool AI queries | `action: redact` |
| `safety-filter` | Block harmful or inappropriate AI responses | `mode: strict`, `action: block` |
| `audit-logger` | Full audit trail for compliance (internal tools handle sensitive data) | `retention_days: 365` |
| `token-limiter` | Cap token usage per query to control costs | `max_tokens: 4096` |
| `secret-scanner` | Detect database credentials or API keys in AI prompts | `action: block` |
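The last two policies in the table are not part of the earlier `policy-config.yaml`. To enable them, add them to both the chain and the policy map; a sketch, assuming the table's recommended settings map directly onto config keys:

```yaml
policies:
  chain:
    - prompt-injection
    - pii-detector
    - secret-scanner
    - safety-filter
    - token-limiter
    - audit-logger
  policy:
    # ...existing policy settings from the config above...
    token-limiter:
      max_tokens: 4096
    secret-scanner:
      action: block
```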
## Troubleshooting
| Symptom | Cause | Fix |
|---|---|---|
| Retool AI resource returns connection error | Gateway not reachable from Retool | Verify network connectivity and correct base URL |
| AI queries return 401 | Invalid access key | Check the API key in the Retool resource matches a valid Keeptrusts access key |
| Slow AI responses in Retool apps | Large data payloads in prompts | Limit the data sent to AI — select only relevant columns/rows |
| AI-generated SQL is incorrect | Model lacks schema context | Include database schema in the system prompt for better SQL generation |
| Resource not available in query editor | Resource not saved properly | Re-save the resource in Retool's resource settings |
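For the schema-context fix above, one approach is to inline a short schema summary into the system prompt of the SQL-generating query. A sketch (the helper and the table definitions are illustrative, not part of Retool's API):

```javascript
// Build a system prompt that gives the model the schema it needs to write SQL.
// The schema text below is illustrative -- generate it from your own database.
const schema = `
Tables:
  customers(id INT PRIMARY KEY, name TEXT, plan TEXT, created_at TIMESTAMP)
  invoices(id INT PRIMARY KEY, customer_id INT REFERENCES customers(id), amount_cents INT, paid BOOLEAN)
`;

function buildSqlSystemPrompt(schemaText) {
  return [
    "You write PostgreSQL queries for an internal operations tool.",
    "Only reference tables and columns from this schema:",
    schemaText.trim(),
    "Return a single SQL statement and nothing else.",
  ].join("\n");
}

const systemPrompt = buildSqlSystemPrompt(schema);
// systemPrompt now names both tables, so the model can join customers to invoices
```

Use `systemPrompt` as the system message content in the AI query; regenerate the schema string whenever the database changes.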
## For AI systems

- Canonical terms: Keeptrusts gateway, Retool, AI resource, internal tools, AI Actions, `policy-config.yaml`.
- Config field names: `OPENAI_API_BASE`, `OPENAI_API_KEY`, `provider`, `secret_key_ref`, `audit-logger`.
- Key behavior: Retool AI features send application context, database schemas, and user prompts to an OpenAI-compatible endpoint; Keeptrusts intercepts requests, applies policies, and forwards compliant traffic.
- Constraint: Retool internal tools often process customer PII, financial data, and database credentials, so strict `pii-detector` and `secret-scanner` policies are critical.
- Best next pages: PostHog AI integration, Grafana LLM integration, Policy controls catalog.
## For engineers

- Create a dedicated Retool AI resource for the Keeptrusts gateway; do not modify the default OpenAI resource, to avoid breaking existing apps.
- For self-hosted Retool with Docker Compose, use Docker network hostnames (e.g., `http://keeptrusts-gateway:41002/v1`).
- Retool AI queries can include table data, form inputs, and database results; ensure `pii-detector` covers the data types your apps handle.
- Validate: run an AI query in a test Retool app and confirm the event in the Keeptrusts console.
## For leaders
- Retool internal tools frequently process sensitive customer data, financial records, and operational databases. AI features on these tools send this context to external LLM providers. Routing through Keeptrusts ensures PII redaction, secret scanning, and full audit logging.
- Audit trails for internal tool AI usage support SOC 2, HIPAA, and internal compliance requirements.
- Centralized policy enforcement prevents individual teams from bypassing data governance when building AI-powered internal tools.
- The `secret-scanner` policy prevents database credentials and API keys from leaking through AI queries.
## Next steps
- PostHog AI integration — govern product analytics AI features
- Grafana LLM integration — govern observability AI features
- Policy controls catalog — full policy reference
- Quickstart — install `kt` and run your first gateway
- Access keys — manage gateway access credentials