# Grafana LLM Plugin
Grafana's LLM plugin adds AI-powered features to your observability stack — natural-language dashboard queries, automated incident summaries, alert explanation, and log pattern analysis. The plugin connects to OpenAI-compatible LLM endpoints to power these features.
This page explains how to route Grafana LLM plugin traffic through the Keeptrusts gateway so your organization's prompt-injection detection, PII redaction, and audit logging apply to every AI-assisted observability interaction.
## Use this page when
- You are configuring the Grafana LLM plugin to use the Keeptrusts gateway as its AI endpoint.
- You need policy enforcement on AI queries that include metrics, logs, and infrastructure context.
- You need general gateway setup — start with the quickstart guide instead.
## Audience
- Primary: Technical Engineers (SRE, Platform, DevOps)
- Secondary: Technical Leaders, AI Agents
## Prerequisites
- Grafana 10.0+ with the LLM plugin installed and enabled.
- Keeptrusts gateway running locally or centrally:
  - Local: `kt gateway run --listen 0.0.0.0:41002 --policy-config policy-config.yaml`
  - Hosted: `https://gateway.keeptrusts.com/v1`
- Upstream LLM provider API key (e.g., OpenAI) configured in the gateway environment.
- Grafana admin access to configure plugin settings.
## Configuration

### Gateway policy config
Create a `policy-config.yaml` for observability AI governance:

```yaml
pack:
  name: grafana-llm-governance
  version: 1.0.0
  enabled: true
policies:
  chain:
    - prompt-injection
    - pii-detector
    - safety-filter
    - audit-logger
  policy:
    prompt-injection:
      threshold: 0.8
      action: block
    pii-detector:
      action: redact
    safety-filter:
      mode: standard
      action: block
    audit-logger:
      retention_days: 90
providers:
  strategy: single
  targets:
    - id: openai-for-grafana
      provider: openai:chat:gpt-4o
      secret_key_ref:
        env: OPENAI_API_KEY
```
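To make the `chain` semantics concrete, here is a toy sketch of how a request might flow through the policies in order, with a blocking policy short-circuiting the chain. This is an illustration only, not the gateway's implementation; the two stand-in policy functions are simplified placeholders:

```python
from typing import Callable, Optional

# A policy inspects a prompt and either passes it through (possibly
# transformed, e.g. redacted) or returns None to block the request.
Policy = Callable[[str], Optional[str]]

def prompt_injection(prompt: str) -> Optional[str]:
    # Toy stand-in for a scored detector with threshold 0.8 / action: block.
    return None if "ignore previous instructions" in prompt.lower() else prompt

def pii_detector(prompt: str) -> Optional[str]:
    # Toy stand-in for action: redact.
    return prompt.replace("10.0.3.7", "[REDACTED_IP]")

def run_chain(chain: list[Policy], prompt: str) -> Optional[str]:
    """Apply each policy in order; a None result blocks the request."""
    for policy in chain:
        result = policy(prompt)
        if result is None:
            return None  # blocked; nothing is forwarded upstream
        prompt = result
    return prompt  # compliant prompt forwarded to the provider

print(run_chain([prompt_injection, pii_detector], "Why is 10.0.3.7 slow?"))
# → Why is [REDACTED_IP] slow?
```

The ordering in `chain` matters: detection policies that block run before transformations, and `audit-logger` last sees the final, redacted prompt.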
### Grafana LLM plugin datasource config
Configure the Grafana LLM plugin to point at the Keeptrusts gateway. In Grafana, navigate to Administration > Plugins > LLM and configure the OpenAI settings:
| Setting | Value |
|---|---|
| OpenAI URL | `http://localhost:41002/v1` (local) or `https://gateway.keeptrusts.com/v1` (hosted) |
| API Key | Your Keeptrusts access key |
| Model | `gpt-4o` (or your preferred model) |
Alternatively, configure the plugin via a Grafana app provisioning YAML:

```yaml
apiVersion: 1
apps:
  - type: grafana-llm-app
    jsonData:
      openAI:
        url: "http://keeptrusts-gateway:41002/v1"
        organizationId: ""
        openAIProvider: "openai"
    secureJsonData:
      openAIKey: "your-keeptrusts-access-key"
```
For Docker Compose deployments, set environment variables on the Grafana container:

```yaml
services:
  grafana:
    image: grafana/grafana:11.0.0
    environment:
      - GF_PLUGIN_GRAFANA_LLM_APP_OPENAI_URL=http://keeptrusts-gateway:41002/v1
      - GF_PLUGIN_GRAFANA_LLM_APP_OPENAI_API_KEY=your-keeptrusts-access-key
```
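A fuller two-service Compose sketch puts Grafana and the gateway on the same network so the `keeptrusts-gateway` hostname resolves. The `keeptrusts/gateway` image name and the `/etc/keeptrusts/` mount path are assumptions — substitute your actual image and paths:

```yaml
services:
  keeptrusts-gateway:
    image: keeptrusts/gateway:latest   # assumed image name
    command: ["kt", "gateway", "run", "--listen", "0.0.0.0:41002",
              "--policy-config", "/etc/keeptrusts/policy-config.yaml"]
    environment:
      - OPENAI_API_KEY=${OPENAI_API_KEY}
    volumes:
      - ./policy-config.yaml:/etc/keeptrusts/policy-config.yaml:ro

  grafana:
    image: grafana/grafana:11.0.0
    depends_on:
      - keeptrusts-gateway
    environment:
      - GF_PLUGIN_GRAFANA_LLM_APP_OPENAI_URL=http://keeptrusts-gateway:41002/v1
      - GF_PLUGIN_GRAFANA_LLM_APP_OPENAI_API_KEY=your-keeptrusts-access-key
    ports:
      - "3000:3000"
```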
## Setup steps

1. Start the Keeptrusts gateway:

   ```shell
   export OPENAI_API_KEY="sk-your-openai-key"
   kt gateway run --listen 0.0.0.0:41002 --policy-config policy-config.yaml
   ```

2. Install and enable the Grafana LLM plugin if not already active:

   ```shell
   grafana cli plugins install grafana-llm-app
   ```

3. Set the plugin's OpenAI URL to `http://localhost:41002/v1` in Grafana's plugin settings.

4. Restart Grafana to pick up the configuration.

5. Open Grafana and use an AI feature (e.g., "Explain this panel" on any dashboard panel).
## Verification
Test the integration from Grafana:
- Open any dashboard in Grafana.
- Click the AI assistant icon on a panel and ask a question.
- Confirm the response arrives through the gateway.
Verify from the command line (include your Keeptrusts access key as a bearer token):

```shell
curl http://localhost:41002/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer your-keeptrusts-access-key" \
  -d '{
    "model": "gpt-4o",
    "messages": [
      {"role": "user", "content": "Explain a spike in error rate on a Grafana dashboard."}
    ]
  }'
```
Check the Keeptrusts events dashboard to confirm the request was logged with policy decisions.
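The same check can be scripted. A minimal Python sketch using only the standard library; the URL and key are placeholders, and `build_chat_payload` is a local helper for this example, not part of any Keeptrusts SDK:

```python
import json
import urllib.request

def build_chat_payload(model: str, prompt: str) -> dict:
    """Build an OpenAI-compatible chat completion request body."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

def chat_via_gateway(base_url: str, api_key: str, model: str, prompt: str) -> dict:
    """POST a chat completion through the gateway and return the parsed JSON."""
    req = urllib.request.Request(
        f"{base_url}/chat/completions",
        data=json.dumps(build_chat_payload(model, prompt)).encode(),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# Requires a running gateway:
# reply = chat_via_gateway("http://localhost:41002/v1",
#                          "your-keeptrusts-access-key",
#                          "gpt-4o", "Explain a spike in error rate.")
payload = build_chat_payload("gpt-4o", "Explain a spike in error rate.")
print(json.dumps(payload, indent=2))
```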
## Recommended policies

| Policy | Purpose | Recommended setting |
|---|---|---|
| `prompt-injection` | Block injection attempts in natural-language dashboard queries | `threshold: 0.8`, `action: block` |
| `pii-detector` | Redact hostnames, IP addresses, and user identifiers from observability prompts | `action: redact` |
| `safety-filter` | Block off-topic or harmful content in AI responses | `mode: standard`, `action: block` |
| `audit-logger` | Log all Grafana AI interactions for security review | `retention_days: 90` |
| `token-limiter` | Cap token usage for AI-assisted dashboard queries | `max_tokens: 2048` |
## Troubleshooting
| Symptom | Cause | Fix |
|---|---|---|
| Grafana AI features show "Connection error" | Gateway not reachable from Grafana | Verify network connectivity between Grafana and the gateway host |
| Plugin settings not saved | Grafana requires admin role | Log in as a Grafana admin to configure plugin settings |
| AI responses reference wrong data | Model lacks dashboard context | Ensure the LLM plugin sends panel data in the prompt (default behavior) |
| Slow AI responses on dashboard queries | Large metric payloads in prompts | Enable token-limiter to cap prompt size |
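For the "Connection error" row above, a quick TCP probe run from the Grafana host (or inside the Grafana container) tells you whether the gateway port is reachable at all. A minimal sketch; the host and port are the defaults used on this page:

```python
import socket

def is_reachable(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# From the Grafana side, check the gateway endpoint:
print(is_reachable("localhost", 41002))
```

If this returns `False` from inside a container but `True` on the host, the issue is Docker networking — use the service hostname (e.g., `keeptrusts-gateway`) rather than `localhost`.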
## For AI systems

- Canonical terms: Keeptrusts gateway, Grafana, Grafana LLM plugin, observability AI, dashboard AI, `policy-config.yaml`.
- Config field names: `openAI.url`, `openAIKey`, `GF_PLUGIN_GRAFANA_LLM_APP_OPENAI_URL`, `provider`, `secret_key_ref`.
- Key behavior: The Grafana LLM plugin sends dashboard context and natural-language queries to an OpenAI-compatible endpoint; Keeptrusts intercepts requests, applies policies, and forwards compliant traffic.
- Constraint: Grafana AI prompts can include metric names, hostnames, and infrastructure topology — `pii-detector` should be tuned for observability data.
- Best next pages: PostHog AI integration, Docker AI integration, Policy controls catalog.
## For engineers

- The Grafana LLM plugin expects an OpenAI-compatible endpoint — the Keeptrusts gateway is a drop-in replacement.
- For Docker Compose setups, use the Docker network hostname (e.g., `http://keeptrusts-gateway:41002/v1`) instead of `localhost`.
- Validate by triggering any AI feature in Grafana and checking the Keeptrusts events dashboard.
- Tune `pii-detector` sensitivity to avoid redacting legitimate metric labels and dashboard names.
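One way to reason about that tuning: redact values that look like dotted hostnames or IP addresses while leaving Prometheus-style metric names (underscores, no dots) untouched. An illustrative sketch only — this is not the gateway's actual `pii-detector` implementation:

```python
import re

# Patterns for observability PII: IPv4 addresses and dotted hostnames.
IPV4 = re.compile(r"\b(?:\d{1,3}\.){3}\d{1,3}\b")
HOSTNAME = re.compile(r"\b[a-z][a-z0-9-]*(?:\.[a-z][a-z0-9-]*)+\b")

def redact_observability_pii(text: str) -> str:
    """Redact IPs and FQDN-style hostnames. Metric names such as
    http_requests_total contain underscores and no dots, so they survive."""
    text = IPV4.sub("[REDACTED_IP]", text)
    return HOSTNAME.sub("[REDACTED_HOST]", text)

prompt = "Why did http_requests_total spike on web-01.prod.example.com (10.0.3.7)?"
print(redact_observability_pii(prompt))
# → Why did http_requests_total spike on [REDACTED_HOST] ([REDACTED_IP])?
```

Patterns this loose will also catch domain-like dashboard names, which is exactly the over-redaction the bullet above warns about — an allowlist of known metric labels and dashboard titles is a common complement.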
## For leaders
- Grafana AI features can send infrastructure topology, metric names, and incident context to third-party LLM providers. Routing through Keeptrusts ensures sensitive operational data is redacted before it leaves your network.
- Audit logs provide visibility into which teams use AI-assisted observability and what infrastructure context is shared.
- Centralized policy enforcement applies consistently across all Grafana instances in your organization.
## Next steps
- PostHog AI integration — govern product analytics AI features
- Docker AI integration — govern container AI workflows
- Policy controls catalog — full policy reference
- Quickstart — install `kt` and run your first gateway