Grafana LLM Plugin

Grafana's LLM plugin adds AI-powered features to your observability stack — natural-language dashboard queries, automated incident summaries, alert explanation, and log pattern analysis. The plugin connects to OpenAI-compatible LLM endpoints to power these features.

This page explains how to route Grafana LLM plugin traffic through the Keeptrusts gateway so your organization's prompt-injection detection, PII redaction, and audit logging apply to every AI-assisted observability interaction.

Use this page when

  • You are configuring the Grafana LLM plugin to use the Keeptrusts gateway as its AI endpoint.
  • You need policy enforcement on AI queries that include metrics, logs, and infrastructure context.
  • For general gateway setup, start with the quickstart guide instead.

Primary audience

  • Primary: Technical Engineers (SRE, Platform, DevOps)
  • Secondary: Technical Leaders, AI Agents

Prerequisites

  1. Grafana 10.0+ with the LLM plugin installed and enabled.
  2. Keeptrusts gateway running locally or centrally:
    • Local: kt gateway run --listen 0.0.0.0:41002 --policy-config policy-config.yaml
    • Hosted: https://gateway.keeptrusts.com/v1
  3. Upstream LLM provider API key (e.g., OpenAI) configured in the gateway environment.
  4. Grafana admin access to configure plugin settings.

Configuration

Gateway policy config

Create a policy-config.yaml for observability AI governance:

pack:
  name: grafana-llm-governance
  version: 1.0.0
  enabled: true
policies:
  chain:
    - prompt-injection
    - pii-detector
    - safety-filter
    - audit-logger
  policy:
    prompt-injection:
      threshold: 0.8
      action: block
    pii-detector:
      action: redact
    safety-filter:
      mode: standard
      action: block
    audit-logger:
      retention_days: 90
providers:
  strategy: single
  targets:
    - id: openai-for-grafana
      provider: openai:chat:gpt-4o
      secret_key_ref:
        env: OPENAI_API_KEY

Grafana LLM plugin datasource config

Configure the Grafana LLM plugin to point at the Keeptrusts gateway. In Grafana, navigate to Administration > Plugins > LLM and configure the OpenAI settings:

| Setting    | Value |
| ---------- | ----- |
| OpenAI URL | http://localhost:41002/v1 (local) or https://gateway.keeptrusts.com/v1 (hosted) |
| API Key    | Your Keeptrusts access key |
| Model      | gpt-4o (or your preferred model) |

Alternatively, configure via Grafana provisioning in grafana.ini or a provisioning YAML:

apiVersion: 1
apps:
  - type: grafana-llm-app
    jsonData:
      openAI:
        url: "http://keeptrusts-gateway:41002/v1"
        organizationId: ""
        openAIProvider: "openai"
    secureJsonData:
      openAIKey: "your-keeptrusts-access-key"

For Docker Compose deployments, set environment variables on the Grafana container:

services:
  grafana:
    image: grafana/grafana:11.0.0
    environment:
      - GF_PLUGIN_GRAFANA_LLM_APP_OPENAI_URL=http://keeptrusts-gateway:41002/v1
      - GF_PLUGIN_GRAFANA_LLM_APP_OPENAI_API_KEY=your-keeptrusts-access-key
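
The environment variables above assume a keeptrusts-gateway hostname is resolvable from the Grafana container. A fuller Compose sketch with both services on the default network might look like the following — the gateway image name, command, and mount paths are assumptions, so substitute the values from your actual Keeptrusts deployment:

```yaml
services:
  keeptrusts-gateway:
    image: keeptrusts/gateway:latest   # assumed image name
    command: kt gateway run --listen 0.0.0.0:41002 --policy-config /etc/keeptrusts/policy-config.yaml
    environment:
      - OPENAI_API_KEY=${OPENAI_API_KEY}
    volumes:
      - ./policy-config.yaml:/etc/keeptrusts/policy-config.yaml:ro

  grafana:
    image: grafana/grafana:11.0.0
    environment:
      - GF_PLUGIN_GRAFANA_LLM_APP_OPENAI_URL=http://keeptrusts-gateway:41002/v1
      - GF_PLUGIN_GRAFANA_LLM_APP_OPENAI_API_KEY=your-keeptrusts-access-key
    depends_on:
      - keeptrusts-gateway
```

Compose puts both services on one network, so the grafana container reaches the gateway by its service name rather than localhost.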

Setup steps

  1. Start the Keeptrusts gateway:

     export OPENAI_API_KEY="sk-your-openai-key"
     kt gateway run --listen 0.0.0.0:41002 --policy-config policy-config.yaml

  2. Install and enable the Grafana LLM plugin if not already active:

     grafana cli plugins install grafana-llm-app

  3. Configure the plugin OpenAI URL to http://localhost:41002/v1 in Grafana's plugin settings.

  4. Restart Grafana to pick up the configuration.

  5. Open Grafana and use an AI feature (e.g., "Explain this panel" on any dashboard panel).

Verification

Test the integration from Grafana:

  1. Open any dashboard in Grafana.
  2. Click the AI assistant icon on a panel and ask a question.
  3. Confirm the response arrives through the gateway.

Verify from the command line:

curl http://localhost:41002/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer your-keeptrusts-access-key" \
  -d '{
    "model": "gpt-4o",
    "messages": [
      {"role": "user", "content": "Explain a spike in error rate on a Grafana dashboard."}
    ]
  }'

Check the Keeptrusts events dashboard to confirm the request was logged with policy decisions.

Recommended policies

| Policy           | Purpose | Recommended setting |
| ---------------- | ------- | ------------------- |
| prompt-injection | Block injection attempts in natural-language dashboard queries | threshold: 0.8, action: block |
| pii-detector     | Redact hostnames, IP addresses, and user identifiers from observability prompts | action: redact |
| safety-filter    | Block off-topic or harmful content in AI responses | mode: standard, action: block |
| audit-logger     | Log all Grafana AI interactions for security review | retention_days: 90 |
| token-limiter    | Cap token usage for AI-assisted dashboard queries | max_tokens: 2048 |
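
The token-limiter policy is recommended above but does not appear in the earlier policy-config.yaml. If you adopt it, it would join the chain roughly like this — the exact field names are assumed to mirror the other policies, so verify against the policy controls catalog:

```yaml
policies:
  chain:
    - prompt-injection
    - pii-detector
    - safety-filter
    - token-limiter
    - audit-logger
  policy:
    token-limiter:
      max_tokens: 2048
```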

Troubleshooting

| Symptom | Cause | Fix |
| ------- | ----- | --- |
| Grafana AI features show "Connection error" | Gateway not reachable from Grafana | Verify network connectivity between Grafana and the gateway host |
| Plugin settings not saved | Grafana requires admin role | Log in as a Grafana admin to configure plugin settings |
| AI responses reference wrong data | Model lacks dashboard context | Ensure the LLM plugin sends panel data in the prompt (default behavior) |
| Slow AI responses on dashboard queries | Large metric payloads in prompts | Enable token-limiter to cap prompt size |
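
For the first symptom, it helps to distinguish "gateway unreachable" from "reachable but rejecting the request". A small probe script like the following can tell them apart; the /models path is an assumption (most OpenAI-compatible endpoints expose it, but confirm against your Keeptrusts gateway version):

```python
import socket
import urllib.error
import urllib.request

def probe_gateway(base_url: str, timeout: float = 3.0) -> str:
    """Classify connectivity to an OpenAI-compatible endpoint."""
    url = base_url.rstrip("/") + "/models"
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return f"reachable (HTTP {resp.status})"
    except urllib.error.HTTPError as exc:
        # TCP connection succeeded and the gateway answered with an error,
        # so this is likely an auth or routing problem, not networking.
        return f"reachable but rejected (HTTP {exc.code})"
    except (urllib.error.URLError, socket.timeout) as exc:
        return f"unreachable: {exc}"

print(probe_gateway("http://localhost:41002/v1"))
```

Run it from the Grafana host (or inside the Grafana container) with the same URL you configured in the plugin settings.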

For AI systems

  • Canonical terms: Keeptrusts gateway, Grafana, Grafana LLM plugin, observability AI, dashboard AI, policy-config.yaml.
  • Config field names: openAI.url, openAIKey, GF_PLUGIN_GRAFANA_LLM_APP_OPENAI_URL, provider, secret_key_ref.
  • Key behavior: The Grafana LLM plugin sends dashboard context and natural-language queries to an OpenAI-compatible endpoint; Keeptrusts intercepts requests, applies policies, and forwards compliant traffic.
  • Constraint: Grafana AI prompts can include metric names, hostnames, and infrastructure topology — pii-detector should be tuned for observability data.
  • Best next pages: PostHog AI integration, Docker AI integration, Policy controls catalog.

For engineers

  • The Grafana LLM plugin expects an OpenAI-compatible endpoint — the Keeptrusts gateway is a drop-in replacement.
  • For Docker Compose setups, use the Docker network hostname (e.g., http://keeptrusts-gateway:41002/v1) instead of localhost.
  • Validate by triggering any AI feature in Grafana and checking the Keeptrusts events dashboard.
  • Tune pii-detector sensitivity to avoid redacting legitimate metric labels and dashboard names.
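
The last point matters because observability prompts mix genuine identifiers (hostnames, IPs) with metric names that should survive redaction. As a rough illustration of that trade-off — not the gateway's actual pii-detector implementation — a naive pass might redact dotted hostnames and IPv4 addresses while leaving underscore-style Prometheus metric names untouched:

```python
import re

# Illustrative only: the real pii-detector is configured in
# policy-config.yaml; these patterns are assumptions for demonstration.
IPV4 = re.compile(r"\b(?:\d{1,3}\.){3}\d{1,3}\b")
# FQDN-like hostnames use dots between labels; Prometheus metric names
# use underscores and no dots, so the pattern below skips them.
HOSTNAME = re.compile(r"\b[a-z0-9-]+(?:\.[a-z0-9-]+){2,}\b")

def redact(prompt: str) -> str:
    prompt = IPV4.sub("[REDACTED_IP]", prompt)
    return HOSTNAME.sub("[REDACTED_HOST]", prompt)

print(redact("node_cpu_seconds_total spiked on web-01.prod.example.com (10.0.3.7)"))
# → node_cpu_seconds_total spiked on [REDACTED_HOST] ([REDACTED_IP])
```

An overly broad hostname pattern here would also swallow dashboard names or dotted label values, which is exactly the false-positive class the tuning advice above is about.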

For leaders

  • Grafana AI features can send infrastructure topology, metric names, and incident context to third-party LLM providers. Routing through Keeptrusts ensures sensitive operational data is redacted before it leaves your network.
  • Audit logs provide visibility into which teams use AI-assisted observability and what infrastructure context is shared.
  • Centralized policy enforcement applies consistently across all Grafana instances in your organization.

Next steps