PostHog AI

PostHog includes AI-powered features for product analytics — natural-language insight queries, AI-generated SQL, automated funnel analysis, and session replay summaries. These features send analytics context, user behavior data, and natural-language queries to upstream LLM providers. When PostHog connects to external AI endpoints for these capabilities, your product data flows to third-party providers.

This page explains how to route PostHog's LLM-connected features through the Keeptrusts gateway so policy enforcement, PII redaction, and audit logging apply to every AI-assisted analytics interaction.

Use this page when

  • You are configuring PostHog's AI features to use the Keeptrusts gateway as the LLM endpoint.
  • You need policy enforcement on AI queries that include product analytics data and user behavior context.
  • For general gateway setup, start with the quickstart guide instead.

Primary audience

  • Primary: Technical Engineers (Product, Data, Platform)
  • Secondary: Technical Leaders, AI Agents

Prerequisites

  1. PostHog instance (self-hosted or PostHog Cloud) with AI features enabled.
  2. PostHog admin access to configure AI integrations.
  3. Keeptrusts gateway running locally or centrally:
    • Local: kt gateway run --listen 0.0.0.0:41002 --policy-config policy-config.yaml
    • Hosted: https://gateway.keeptrusts.com/v1
  4. Upstream LLM provider API key configured in the gateway environment.

Configuration

Gateway policy config

Create a policy-config.yaml for product analytics AI governance:

pack:
  name: posthog-ai-governance
  version: 1.0.0
  enabled: true
policies:
  chain:
    - prompt-injection
    - pii-detector
    - safety-filter
    - audit-logger
  policy:
    prompt-injection:
      threshold: 0.8
      action: block
    pii-detector:
      action: redact
    safety-filter:
      mode: standard
      action: block
    audit-logger:
      retention_days: 90
providers:
  strategy: single
  targets:
    - id: openai-for-posthog
      provider: openai:chat:gpt-4o
      secret_key_ref:
        env: OPENAI_API_KEY
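Before starting the gateway, it can be worth sanity-checking that every policy named in the chain has a matching settings block. A minimal sketch (not a Keeptrusts tool; the dict mirrors the policy-config.yaml above, and in practice you would load the file with a YAML parser such as PyYAML):

```python
# Mirror of policy-config.yaml as a Python dict (field names follow the config above).
config = {
    "pack": {"name": "posthog-ai-governance", "version": "1.0.0", "enabled": True},
    "policies": {
        "chain": ["prompt-injection", "pii-detector", "safety-filter", "audit-logger"],
        "policy": {
            "prompt-injection": {"threshold": 0.8, "action": "block"},
            "pii-detector": {"action": "redact"},
            "safety-filter": {"mode": "standard", "action": "block"},
            "audit-logger": {"retention_days": 90},
        },
    },
}

def missing_policy_settings(cfg):
    """Return chain entries that have no corresponding settings under policies.policy."""
    chain = cfg["policies"]["chain"]
    settings = cfg["policies"]["policy"]
    return [name for name in chain if name not in settings]

print(missing_policy_settings(config))  # → []
```

An empty list means every chained policy is configured; any names returned point at entries the gateway would have to run with defaults or reject.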

Self-hosted PostHog configuration

For self-hosted PostHog, configure the OpenAI endpoint to point at the Keeptrusts gateway. Set the following environment variables in your PostHog deployment:

OPENAI_API_BASE="http://keeptrusts-gateway:41002/v1"
OPENAI_API_KEY="your-keeptrusts-access-key"

For Docker Compose deployments, add these to the PostHog web service:

services:
  posthog-web:
    image: posthog/posthog:latest
    environment:
      - OPENAI_API_BASE=http://keeptrusts-gateway:41002/v1
      - OPENAI_API_KEY=your-keeptrusts-access-key
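The `keeptrusts-gateway` hostname above assumes the gateway runs as a service in the same Compose project. A sketch of what that service might look like (the image name is hypothetical; the command flags come from the prerequisites above):

```yaml
services:
  keeptrusts-gateway:
    image: keeptrusts/gateway:latest   # hypothetical image name — use your actual gateway image
    command: kt gateway run --listen 0.0.0.0:41002 --policy-config /etc/keeptrusts/policy-config.yaml
    environment:
      - OPENAI_API_KEY=${OPENAI_API_KEY}   # upstream provider key from the host environment
    volumes:
      - ./policy-config.yaml:/etc/keeptrusts/policy-config.yaml:ro
```

Because both services share the default Compose network, PostHog can reach the gateway at `http://keeptrusts-gateway:41002/v1` without publishing the port to the host.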

For Kubernetes/Helm deployments, set the values in your PostHog Helm values file:

env:
  - name: OPENAI_API_BASE
    value: "http://keeptrusts-gateway.keeptrusts.svc:41002/v1"
  - name: OPENAI_API_KEY
    valueFrom:
      secretKeyRef:
        name: keeptrusts-access-key
        key: api-key
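The `secretKeyRef` above assumes a Secret named `keeptrusts-access-key` with an `api-key` entry already exists in the namespace where PostHog is deployed. A minimal sketch of that Secret:

```yaml
apiVersion: v1
kind: Secret
metadata:
  name: keeptrusts-access-key
  # Must be created in the same namespace as the PostHog release,
  # not the gateway's namespace.
type: Opaque
stringData:
  api-key: your-keeptrusts-access-key
```

Note that the gateway itself is addressed in the `keeptrusts` namespace (`keeptrusts-gateway.keeptrusts.svc`), while the Secret must live alongside PostHog so the `secretKeyRef` can resolve.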

PostHog Cloud configuration

For PostHog Cloud, AI features use PostHog's managed LLM connections. To route these through the Keeptrusts gateway, configure a custom AI provider in PostHog's project settings under Settings > AI & Integrations, setting the API endpoint to your Keeptrusts hosted gateway:

Setting        Value
API Base URL   https://gateway.keeptrusts.com/v1
API Key        Your Keeptrusts access key
Model          gpt-4o

Setup steps

  1. Start the Keeptrusts gateway:

     export OPENAI_API_KEY="sk-your-openai-key"
     kt gateway run --listen 0.0.0.0:41002 --policy-config policy-config.yaml

  2. Configure PostHog to use the gateway endpoint via environment variables or project settings.

  3. Restart PostHog services to pick up the new configuration.

  4. Use a PostHog AI feature (e.g., "Ask AI" in the insights panel) to test the connection.

Verification

Test from PostHog:

  1. Open PostHog and navigate to any insight or dashboard.
  2. Use the "Ask AI" or natural-language query feature.
  3. Confirm the response arrives and the request appears in the Keeptrusts events dashboard.

Test from the command line:

curl http://localhost:41002/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gpt-4o",
    "messages": [
      {"role": "user", "content": "Analyze user retention trends for the last 30 days."}
    ]
  }'
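The same check can be scripted. A minimal Python sketch using only the standard library (the endpoint matches the local gateway from the setup steps; the key is a placeholder, and whether the gateway requires an Authorization header is an assumption):

```python
import json
import urllib.request

GATEWAY_URL = "http://localhost:41002/v1/chat/completions"  # local gateway from the setup steps
API_KEY = "your-keeptrusts-access-key"                      # placeholder

def build_payload(prompt, model="gpt-4o"):
    """Build an OpenAI-compatible chat completions payload."""
    return {"model": model, "messages": [{"role": "user", "content": prompt}]}

def send(payload):
    """POST the payload to the gateway and return the parsed JSON response."""
    req = urllib.request.Request(
        GATEWAY_URL,
        data=json.dumps(payload).encode(),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {API_KEY}",  # assumption: gateway accepts a bearer token
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# Example (requires a running gateway):
# response = send(build_payload("Analyze user retention trends for the last 30 days."))
```

A blocked request should surface as a policy error from the gateway rather than a normal completion, which is a quick way to confirm enforcement is active.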

Recommended policy settings

Policy             Purpose                                                     Recommended setting
prompt-injection   Block injection in natural-language analytics queries       threshold: 0.8, action: block
pii-detector       Redact user IDs, emails, and behavioral data from prompts   action: redact
safety-filter      Block off-topic or harmful content in AI responses          mode: standard, action: block
audit-logger       Log all PostHog AI interactions                             retention_days: 90
token-limiter      Cap token usage for complex analytics queries               max_tokens: 4096
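The token-limiter policy is not part of the policy-config.yaml shown earlier. A sketch of how it might be added, assuming the `max_tokens` key from the table above and that chain order determines evaluation order:

```yaml
policies:
  chain:
    - prompt-injection
    - pii-detector
    - safety-filter
    - token-limiter     # placement before audit-logger is an assumption
    - audit-logger
  policy:
    token-limiter:
      max_tokens: 4096  # cap per the recommended setting above
```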

Troubleshooting

Symptom                               Cause                                Fix
PostHog AI features show error        Gateway not reachable from PostHog   Verify network connectivity and the gateway URL
AI responses lack analytics context   PostHog sends limited context        This is expected; PostHog controls what context it sends to the LLM
Slow AI responses                     Large analytics payloads             Enable token-limiter to cap prompt size
Environment variables not picked up   PostHog not restarted                Restart PostHog services after changing env vars
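A common source of the "gateway not reachable" symptom is a malformed OPENAI_API_BASE value. A small standalone sketch (not part of Keeptrusts or PostHog) that flags the usual mistakes:

```python
from urllib.parse import urlparse

def check_base_url(url):
    """Return a list of likely misconfigurations in an OPENAI_API_BASE value."""
    problems = []
    parsed = urlparse(url)
    if parsed.scheme not in ("http", "https"):
        problems.append("missing http:// or https:// scheme")
    if not parsed.path.rstrip("/").endswith("/v1"):
        problems.append("path should end in /v1 for OpenAI-compatible clients")
    return problems

print(check_base_url("http://keeptrusts-gateway:41002/v1"))  # → []
print(check_base_url("http://keeptrusts-gateway:41002"))     # flags the missing /v1 path
```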

For AI systems

  • Canonical terms: Keeptrusts gateway, PostHog, product analytics AI, natural-language queries, policy-config.yaml.
  • Config field names: OPENAI_API_BASE, OPENAI_API_KEY, provider, secret_key_ref, audit-logger.
  • Key behavior: PostHog AI features send analytics context and natural-language queries to an OpenAI-compatible endpoint; Keeptrusts intercepts requests, applies policies, and forwards compliant traffic.
  • Constraint: PostHog controls what analytics context it includes in prompts — the gateway governs the LLM communication, not PostHog's internal data access.
  • Best next pages: Grafana LLM integration, Retool AI integration, Policy controls catalog.

For engineers

  • For self-hosted PostHog, set OPENAI_API_BASE as an environment variable on the PostHog web service.
  • For Docker Compose, use Docker network hostnames (e.g., http://keeptrusts-gateway:41002/v1).
  • PostHog AI features include product analytics context in prompts — ensure pii-detector is configured to catch user identifiers.
  • Validate: trigger an AI query in PostHog and confirm the event in the Keeptrusts console.

For leaders

  • PostHog AI features can send user behavioral data, event properties, and analytics context to third-party LLM providers. Routing through Keeptrusts ensures PII is redacted and every interaction is logged.
  • Audit logs support GDPR, SOC 2, and internal data governance requirements for product analytics AI usage.
  • Centralized policy enforcement applies across all PostHog instances and projects in your organization.

Next steps