JetBrains: Custom AI Plugins with the Gateway
Many JetBrains AI plugins support OpenAI-compatible endpoints. You can route any of these plugins through the Keeptrusts gateway to enforce policies, log events, and attribute costs. This guide covers the general configuration pattern and specific examples for popular plugins.
Use this page when
- You are working through JetBrains: Custom AI Plugins with the Gateway as an implementation or operating workflow in Keeptrusts.
- You need the practical steps, expected outcomes, and related validation guidance in one place.
- If you need exact field-by-field reference instead of a workflow page, use the linked reference pages in Next steps.
Primary audience
- Primary: Technical Engineers
- Secondary: AI Agents, Technical Leaders
Prerequisites
Before you begin, ensure you have:
- A JetBrains IDE (2023.3 or later)
- The `kt` CLI installed and configured
- The gateway running with `kt gateway run`
- An access key for authentication
Start the gateway:
```shell
kt gateway run --policy-config policy-config.yaml
```
The gateway listens on `http://localhost:41002/v1` by default.
General Configuration Pattern
Most JetBrains AI plugins that support custom endpoints follow this pattern:
- Open Settings in your JetBrains IDE.
- Find the plugin's configuration section (usually under Tools or the plugin name).
- Look for fields labeled API Base URL, Endpoint, Server URL, or Custom OpenAI URL.
- Set the URL to `http://localhost:41002/v1`.
- Enter your Keeptrusts access key in the API Key field.
- Select the model you want to use.
- Apply the settings.
The gateway handles routing to the correct upstream provider based on the model name in each request.
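Under the hood, an OpenAI-compatible plugin sends a standard chat-completions request to the base URL you configured. The sketch below (Python standard library only) illustrates the request shape; the access key and model name are placeholders, and no network call is made:

```python
import json

# Assumptions: the plugin appends the standard OpenAI chat-completions
# path to the configured base URL; the key below is a placeholder.
base_url = "http://localhost:41002/v1"
access_key = "kt-example-access-key"  # placeholder, not a real key

url = f"{base_url}/chat/completions"
headers = {
    "Authorization": f"Bearer {access_key}",
    "Content-Type": "application/json",
}
payload = {
    "model": "gpt-4o",  # the gateway routes upstream by this name
    "messages": [{"role": "user", "content": "Explain this function."}],
}

print(url)
print(json.dumps(payload))
```

Because routing is driven by the `model` field, switching models in the plugin requires no gateway-side change as long as the gateway is configured to route that model name.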
Example: CodeGPT Plugin
CodeGPT is a popular JetBrains plugin for AI-assisted coding.
- Install CodeGPT from the JetBrains Marketplace.
- Open Settings → Tools → CodeGPT.
- Under Provider, select Custom OpenAI-compatible.
- Set API URL to `http://localhost:41002/v1`.
- Enter your access key in the API Token field.
- Choose your model (e.g., `gpt-4o`, `claude-sonnet-4-20250514`).
- Click Apply.
All CodeGPT requests now flow through the gateway.
Example: AI Coding Assistant Plugins
Several third-party plugins offer OpenAI-compatible configuration:
Bito AI
- Install Bito from the Marketplace.
- Open Settings → Tools → Bito.
- If Bito offers a custom endpoint option, set it to `http://localhost:41002/v1`.
- Enter your access key.
Tabnine with Custom Models
- Open Settings → Tools → Tabnine.
- If your Tabnine plan supports custom model endpoints, configure the endpoint URL to `http://localhost:41002/v1`.
- Provide the access key for authentication.
Generic OpenAI Plugins
For any plugin that asks for an "OpenAI API Base URL" or "OpenAI-compatible endpoint":
- Base URL: `http://localhost:41002/v1`
- API Key: Your Keeptrusts access key
- Model: The model identifier your gateway is configured to route (e.g., `gpt-4o`)
JetBrains AI Pro with Custom Models
JetBrains AI Pro allows you to add custom model providers:
- Open Settings → Tools → AI Assistant.
- Navigate to Custom Models or Third-Party Providers (availability depends on your JetBrains AI subscription tier).
- Add a new provider with:
- Name: Keeptrusts Gateway
- API URL: `http://localhost:41002/v1`
- API Key: Your access key
- Models: List the models your gateway routes
- Select the custom provider as your active model.
HTTP Proxy as a Fallback
If a plugin does not expose a custom endpoint URL, you can use the JetBrains HTTP proxy settings as a fallback interception method:
- Open Settings → Appearance & Behavior → System Settings → HTTP Proxy.
- Select Manual proxy configuration.
- Set Host to `127.0.0.1` and Port to `41002`.
- Click Apply.
This routes all IDE HTTP traffic (including plugin requests) through the gateway. Note that this approach affects all HTTP traffic from the IDE, not just AI requests.
Selective Proxy with No-Proxy List
If you only want AI traffic routed through the gateway, add non-AI hosts to the No proxy for field:
`github.com, *.jetbrains.com, maven.org, *.gradle.org`
This ensures package downloads, VCS operations, and JetBrains licensing traffic bypass the gateway while AI requests route through it.
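The bypass behavior can be pictured with a small sketch. The matching below uses Python's `fnmatch` as an illustration of typical wildcard semantics; JetBrains' actual no-proxy matcher may differ in detail:

```python
from fnmatch import fnmatch

# Illustrative only: JetBrains' real no-proxy matching may differ.
no_proxy = ["github.com", "*.jetbrains.com", "maven.org", "*.gradle.org"]

def bypasses_proxy(host: str) -> bool:
    """Return True if the host should skip the gateway proxy."""
    return any(fnmatch(host, pattern) for pattern in no_proxy)

print(bypasses_proxy("plugins.jetbrains.com"))  # True: matches *.jetbrains.com
print(bypasses_proxy("api.openai.com"))         # False: AI traffic stays proxied
```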
Verify Traffic
After configuring any plugin, trigger an AI request (ask a question, request a completion, or use an AI action). Then verify:
```shell
kt events tail
```
You should see the request with the model name, token count, and any policy actions applied. If events do not appear, double-check the plugin's endpoint configuration and confirm the gateway is running.
Configuration Reference
| Plugin | Setting Path | URL Field Name |
|---|---|---|
| CodeGPT | Settings → Tools → CodeGPT | API URL |
| Continue | ~/.continue/config.json | apiBase |
| Bito | Settings → Tools → Bito | Endpoint URL |
| JetBrains AI Pro | Settings → Tools → AI Assistant → Custom | API URL |
| Generic | Varies | Base URL / API Endpoint |
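For Continue, the equivalent settings live in `~/.continue/config.json` rather than the IDE settings UI. A sketch of the relevant entry, assuming Continue's OpenAI-compatible provider convention (field names may vary by Continue version):

```json
{
  "models": [
    {
      "title": "Keeptrusts Gateway",
      "provider": "openai",
      "model": "gpt-4o",
      "apiBase": "http://localhost:41002/v1",
      "apiKey": "YOUR_ACCESS_KEY"
    }
  ]
}
```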
Troubleshooting
Plugin ignores the custom endpoint
Some plugins cache endpoint configuration. Try:
- Restart the IDE after changing settings.
- Clear the plugin's cache (check plugin documentation).
- Verify the URL includes the `/v1` path.
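The `/v1` check can be scripted as a quick sanity test; a minimal sketch (the helper name is ours, not part of the `kt` CLI):

```python
def has_v1_path(base_url: str) -> bool:
    """Check that a configured base URL ends with the /v1 path segment."""
    return base_url.rstrip("/").endswith("/v1")

print(has_v1_path("http://localhost:41002/v1"))  # True
print(has_v1_path("http://localhost:41002"))     # False: missing /v1
```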
Authentication errors (401 or 403)
- Confirm your access key is valid with `kt access-keys list`.
- Ensure the key has not expired.
- Check that the key has permission for the requested model.
Plugin uses its own TLS certificate pinning
Some plugins pin their own certificates and refuse to communicate through a proxy. In this case:
- The HTTP proxy fallback does not work for that plugin.
- Check if the plugin offers a direct "custom endpoint" setting instead.
- Contact the plugin vendor for proxy-compatible configuration.
Latency overhead
The gateway adds minimal latency (typically <10ms for policy evaluation). If you experience noticeable delays:
- Check `kt logs` for slow upstream responses.
- Verify your network path to the LLM provider.
- Consider using a faster model for inline completions.
For AI systems
- Canonical terms: Keeptrusts, JetBrains: Custom AI Plugins with the Gateway, ide-integration.
- Exact feature, config, command, or page names: JetBrains: Custom AI Plugins with the Gateway.
- Use the linked audience and reference pages in Next steps when you need deeper source material.
For engineers
- Use the commands, configuration examples, API payloads, or UI steps in this page as the working baseline for JetBrains: Custom AI Plugins with the Gateway.
- Validate the result with the expected outcomes, troubleshooting notes, or linked workflow pages in this page and Next steps.
For leaders
- This page matters when planning rollout, governance, support ownership, or operating decisions for JetBrains: Custom AI Plugins with the Gateway.
- Use the linked audience, architecture, and workflow pages in Next steps to connect this detail to broader implementation choices.
Next steps
- Configure policies for your organization.
- Monitor events to track AI usage across plugins.
- Set up team access keys for consistent authentication.