# PostHog AI
PostHog includes AI-powered features for product analytics — natural-language insight queries, AI-generated SQL, automated funnel analysis, and session replay summaries. These features send analytics context, user behavior data, and natural-language queries to upstream LLM providers, which means your product data flows to third parties.
This page explains how to route PostHog's LLM-connected features through the Keeptrusts gateway so policy enforcement, PII redaction, and audit logging apply to every AI-assisted analytics interaction.
## Use this page when
- You are configuring PostHog's AI features to use the Keeptrusts gateway as the LLM endpoint.
- You need policy enforcement on AI queries that include product analytics data and user behavior context.
- For general gateway setup, start with the quickstart guide.
## Audience
- Primary: Technical Engineers (Product, Data, Platform)
- Secondary: Technical Leaders, AI Agents
## Prerequisites
- PostHog instance (self-hosted or PostHog Cloud) with AI features enabled.
- PostHog admin access to configure AI integrations.
- Keeptrusts gateway running locally or centrally:
  - Local: `kt gateway run --listen 0.0.0.0:41002 --policy-config policy-config.yaml`
  - Hosted: `https://gateway.keeptrusts.com/v1`
- Upstream LLM provider API key configured in the gateway environment.
## Configuration

### Gateway policy config
Create a policy-config.yaml for product analytics AI governance:
```yaml
pack:
  name: posthog-ai-governance
  version: 1.0.0
  enabled: true
policies:
  chain:
    - prompt-injection
    - pii-detector
    - safety-filter
    - audit-logger
  policy:
    prompt-injection:
      threshold: 0.8
      action: block
    pii-detector:
      action: redact
    safety-filter:
      mode: standard
      action: block
    audit-logger:
      retention_days: 90
providers:
  strategy: single
  targets:
    - id: openai-for-posthog
      provider: openai:chat:gpt-4o
      secret_key_ref:
        env: OPENAI_API_KEY
```
### Self-hosted PostHog configuration
For self-hosted PostHog, configure the OpenAI endpoint to point at the Keeptrusts gateway. Set the following environment variables in your PostHog deployment:
```shell
OPENAI_API_BASE="http://keeptrusts-gateway:41002/v1"
OPENAI_API_KEY="your-keeptrusts-access-key"
```
For Docker Compose deployments, add these to the PostHog web service:
```yaml
services:
  posthog-web:
    image: posthog/posthog:latest
    environment:
      - OPENAI_API_BASE=http://keeptrusts-gateway:41002/v1
      - OPENAI_API_KEY=your-keeptrusts-access-key
```
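The hostname `keeptrusts-gateway` must resolve from the PostHog container. If the gateway runs in the same Compose project, a sketch of the gateway service is below — the image name and mount path are assumptions, not official values; only the `kt gateway run` command and its flags come from this page:

```yaml
services:
  keeptrusts-gateway:
    # Hypothetical image name; substitute your actual Keeptrusts gateway image.
    image: keeptrusts/gateway:latest
    command: ["kt", "gateway", "run", "--listen", "0.0.0.0:41002", "--policy-config", "/etc/keeptrusts/policy-config.yaml"]
    environment:
      # Upstream provider key, read by the gateway environment.
      - OPENAI_API_KEY=${OPENAI_API_KEY}
    volumes:
      # Mount path is illustrative; match it to the --policy-config flag.
      - ./policy-config.yaml:/etc/keeptrusts/policy-config.yaml:ro
    ports:
      - "41002:41002"
```

Running both services on the same Compose network lets PostHog reach the gateway by service name without exposing it publicly.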
For Kubernetes/Helm deployments, set the values in your PostHog Helm values file:
```yaml
env:
  - name: OPENAI_API_BASE
    value: "http://keeptrusts-gateway.keeptrusts.svc:41002/v1"
  - name: OPENAI_API_KEY
    valueFrom:
      secretKeyRef:
        name: keeptrusts-access-key
        key: api-key
```
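The `secretKeyRef` above expects a Secret named `keeptrusts-access-key` with an `api-key` entry in the release namespace. Assuming `kubectl` access, one way to create it (the namespace name is illustrative):

```shell
kubectl create secret generic keeptrusts-access-key \
  --namespace posthog \
  --from-literal=api-key='your-keeptrusts-access-key'
```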
### PostHog Cloud configuration
For PostHog Cloud, AI features use PostHog's managed LLM connections. To route these through the Keeptrusts gateway, configure a custom AI provider in PostHog's project settings under Settings > AI & Integrations, setting the API endpoint to your Keeptrusts hosted gateway:
| Setting | Value |
|---|---|
| API Base URL | https://gateway.keeptrusts.com/v1 |
| API Key | Your Keeptrusts access key |
| Model | gpt-4o |
## Setup steps
- Start the Keeptrusts gateway:

  ```shell
  export OPENAI_API_KEY="sk-your-openai-key"
  kt gateway run --listen 0.0.0.0:41002 --policy-config policy-config.yaml
  ```

- Configure PostHog to use the gateway endpoint via environment variables or project settings.
- Restart PostHog services to pick up the new configuration.
- Use a PostHog AI feature (e.g., "Ask AI" in the insights panel) to test the connection.
## Verification
Test from PostHog:
- Open PostHog and navigate to any insight or dashboard.
- Use the "Ask AI" or natural-language query feature.
- Confirm the response arrives and the request appears in the Keeptrusts events dashboard.
Test from the command line:
```shell
curl http://localhost:41002/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer your-keeptrusts-access-key" \
  -d '{
    "model": "gpt-4o",
    "messages": [
      {"role": "user", "content": "Analyze user retention trends for the last 30 days."}
    ]
  }'
```
## Recommended policies
| Policy | Purpose | Recommended setting |
|---|---|---|
| `prompt-injection` | Block injection in natural-language analytics queries | `threshold: 0.8`, `action: block` |
| `pii-detector` | Redact user IDs, emails, and behavioral data from analytics prompts | `action: redact` |
| `safety-filter` | Block off-topic or harmful content in AI responses | `mode: standard`, `action: block` |
| `audit-logger` | Log all PostHog AI interactions | `retention_days: 90` |
| `token-limiter` | Cap token usage for complex analytics queries | `max_tokens: 4096` |
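The sample `policy-config.yaml` above does not include `token-limiter`. To enable it, add the policy to the chain and configure it, following the field layout of the sample config; `max_tokens` uses the table's recommended value:

```yaml
policies:
  chain:
    - prompt-injection
    - pii-detector
    - safety-filter
    - token-limiter
    - audit-logger
  policy:
    token-limiter:
      max_tokens: 4096
```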
## Troubleshooting
| Symptom | Cause | Fix |
|---|---|---|
| PostHog AI features show an error | Gateway not reachable from PostHog | Verify network connectivity and the gateway URL |
| AI responses lack analytics context | PostHog sends limited context | This is expected — PostHog controls what context it sends to the LLM |
| Slow AI responses | Large analytics payloads | Enable token-limiter to cap prompt size |
| Environment variables not picked up | PostHog not restarted | Restart PostHog services after changing env vars |
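When AI features show a connection error, a reachability check from the PostHog host or container narrows down whether the gateway is the problem. The `/v1/models` path is an assumption — any OpenAI-compatible route the gateway serves will do:

```shell
# Expect an HTTP status code (e.g., 200 or 401); "000" means no connection.
curl -s -o /dev/null -w "%{http_code}\n" \
  -H "Authorization: Bearer your-keeptrusts-access-key" \
  http://keeptrusts-gateway:41002/v1/models
```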
## For AI systems
- Canonical terms: Keeptrusts gateway, PostHog, product analytics AI, natural-language queries, `policy-config.yaml`.
- Config field names: `OPENAI_API_BASE`, `OPENAI_API_KEY`, `provider`, `secret_key_ref`, `audit-logger`.
- Key behavior: PostHog AI features send analytics context and natural-language queries to an OpenAI-compatible endpoint; Keeptrusts intercepts requests, applies policies, and forwards compliant traffic.
- Constraint: PostHog controls what analytics context it includes in prompts — the gateway governs the LLM communication, not PostHog's internal data access.
- Best next pages: Grafana LLM integration, Retool AI integration, Policy controls catalog.
## For engineers
- For self-hosted PostHog, set `OPENAI_API_BASE` as an environment variable on the PostHog web service.
- For Docker Compose, use Docker network hostnames (e.g., `http://keeptrusts-gateway:41002/v1`).
- PostHog AI features include product analytics context in prompts — ensure `pii-detector` is configured to catch user identifiers.
- Validate: trigger an AI query in PostHog and confirm the event in the Keeptrusts console.
## For leaders
- PostHog AI features can send user behavioral data, event properties, and analytics context to third-party LLM providers. Routing through Keeptrusts ensures PII is redacted and every interaction is logged.
- Audit logs support GDPR, SOC 2, and internal data governance requirements for product analytics AI usage.
- Centralized policy enforcement applies across all PostHog instances and projects in your organization.
## Next steps
- Grafana LLM integration — govern observability AI features
- Retool AI integration — govern internal tool AI features
- Policy controls catalog — full policy reference
- Quickstart — install `kt` and run your first gateway