JetBrains: Custom AI Plugins with the Gateway

Many JetBrains AI plugins support OpenAI-compatible endpoints. You can route any of these plugins through the Keeptrusts gateway to enforce policies, log events, and attribute costs. This guide covers the general configuration pattern and specific examples for popular plugins.

Use this page when

  • You want to route a JetBrains AI plugin's OpenAI-compatible traffic through the Keeptrusts gateway.
  • You need the practical steps, expected outcomes, and related validation guidance in one place.
  • You need a workflow walkthrough rather than a field-by-field reference; for the latter, use the linked reference pages in Next steps.

Primary audience

  • Primary: Technical Engineers
  • Secondary: AI Agents, Technical Leaders

Prerequisites

Before you begin, ensure you have:

  • A JetBrains IDE (2023.3 or later)
  • The kt CLI installed and configured
  • The gateway running with kt gateway run
  • An access key for authentication

Start the gateway:

kt gateway run --policy-config policy-config.yaml

The gateway listens on http://localhost:41002/v1 by default.

General Configuration Pattern

Most JetBrains AI plugins that support custom endpoints follow this pattern:

  1. Open Settings in your JetBrains IDE.
  2. Find the plugin's configuration section (usually under Tools or the plugin name).
  3. Look for fields labeled API Base URL, Endpoint, Server URL, or Custom OpenAI URL.
  4. Set the URL to http://localhost:41002/v1.
  5. Enter your Keeptrusts access key in the API Key field.
  6. Select the model you want to use.
  7. Apply the settings.

The gateway handles routing to the correct upstream provider based on the model name in each request.
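To make the routing behavior concrete, the sketch below constructs the kind of OpenAI-compatible chat-completions request a plugin sends through the gateway. The base URL is the gateway default from this page; the access key value is a hypothetical placeholder, and the request is only built, not sent.

```python
import json
import urllib.request

# Gateway default from this page; the key below is a placeholder, not
# a real credential.
GATEWAY_BASE = "http://localhost:41002/v1"
ACCESS_KEY = "kt-example-key"  # hypothetical placeholder

payload = {
    "model": "gpt-4o",  # the gateway routes on this field
    "messages": [{"role": "user", "content": "Explain this function."}],
}

request = urllib.request.Request(
    url=f"{GATEWAY_BASE}/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Content-Type": "application/json",
        "Authorization": f"Bearer {ACCESS_KEY}",
    },
    method="POST",
)

# The request is only constructed here; with the gateway running,
# urllib.request.urlopen(request) would submit it.
print(request.full_url)
```

Because the model name travels in the request body, no plugin-side change is needed to switch providers: the gateway inspects `model` and selects the upstream accordingly.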

Example: CodeGPT Plugin

CodeGPT is a popular JetBrains plugin for AI-assisted coding.

  1. Install CodeGPT from the JetBrains Marketplace.
  2. Open Settings → Tools → CodeGPT.
  3. Under Provider, select Custom OpenAI-compatible.
  4. Set API URL to http://localhost:41002/v1.
  5. Enter your access key in the API Token field.
  6. Choose your model (e.g., gpt-4o, claude-sonnet-4-20250514).
  7. Click Apply.

All CodeGPT requests now flow through the gateway.

Example: AI Coding Assistant Plugins

Several third-party plugins offer OpenAI-compatible configuration:

Bito AI

  1. Install Bito from the Marketplace.
  2. Open Settings → Tools → Bito.
  3. If Bito offers a custom endpoint option, set it to http://localhost:41002/v1.
  4. Enter your access key.

Tabnine with Custom Models

  1. Open Settings → Tools → Tabnine.
  2. If your Tabnine plan supports custom model endpoints, configure the endpoint URL to http://localhost:41002/v1.
  3. Provide the access key for authentication.

Generic OpenAI Plugins

For any plugin that asks for an "OpenAI API Base URL" or "OpenAI-compatible endpoint":

  • Base URL: http://localhost:41002/v1
  • API Key: Your Keeptrusts access key
  • Model: The model identifier your gateway is configured to route (e.g., gpt-4o)

JetBrains AI Pro with Custom Models

JetBrains AI Pro allows you to add custom model providers:

  1. Open Settings → Tools → AI Assistant.
  2. Navigate to Custom Models or Third-Party Providers (availability depends on your JetBrains AI subscription tier).
  3. Add a new provider with:
    • Name: Keeptrusts Gateway
    • API URL: http://localhost:41002/v1
    • API Key: Your access key
    • Models: List the models your gateway routes
  4. Select the custom provider as your active model.

HTTP Proxy as a Fallback

If a plugin does not expose a custom endpoint URL, you can use the JetBrains HTTP proxy settings as a fallback interception method:

  1. Open Settings → Appearance & Behavior → System Settings → HTTP Proxy.
  2. Select Manual proxy configuration.
  3. Set Host to 127.0.0.1 and Port to 41002.
  4. Click Apply.

This routes all IDE HTTP traffic (including plugin requests) through the gateway. Note that this approach affects all HTTP traffic from the IDE, not just AI requests.

Selective Proxy with No-Proxy List

If you only want AI traffic routed through the gateway, add non-AI hosts to the No proxy for field:

github.com, *.jetbrains.com, maven.org, *.gradle.org

This ensures package downloads, VCS operations, and JetBrains licensing traffic bypass the gateway while AI requests route through it.
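The matching idea behind a no-proxy list can be sketched as follows. This is illustrative only: JetBrains' exact wildcard semantics may differ, and the host names tested here are examples, not an exhaustive list.

```python
from fnmatch import fnmatch

# No-proxy entries from the example above. Wildcard entries exempt
# whole domains from the proxy; exact entries match a single host.
NO_PROXY = ["github.com", "*.jetbrains.com", "maven.org", "*.gradle.org"]

def bypasses_proxy(host: str) -> bool:
    """Return True if the host should skip the gateway proxy."""
    return any(fnmatch(host, pattern) for pattern in NO_PROXY)

# Licensing and package hosts bypass the gateway...
print(bypasses_proxy("plugins.jetbrains.com"))  # True
# ...while an AI provider host still routes through it.
print(bypasses_proxy("api.openai.com"))         # False
```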

Verify Traffic

After configuring any plugin, trigger an AI request (ask a question, request a completion, or use an AI action). Then verify:

kt events tail

You should see the request with the model name, token count, and any policy actions applied. If events do not appear, double-check the plugin's endpoint configuration and ensure the gateway is running.

Configuration Reference

Plugin            Setting Path                                URL Field Name
CodeGPT           Settings → Tools → CodeGPT                  API URL
Continue          ~/.continue/config.json                     apiBase
Bito              Settings → Tools → Bito                     Endpoint URL
JetBrains AI Pro  Settings → Tools → AI Assistant → Custom    API URL
Generic           Varies                                      Base URL / API Endpoint
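
Continue reads its endpoint from ~/.continue/config.json rather than IDE settings. The sketch below shows a minimal model entry, assuming Continue's JSON config format with provider, apiBase, and apiKey fields (field names per the reference table; verify against Continue's own documentation, as the format may change between versions):

```json
{
  "models": [
    {
      "title": "Keeptrusts Gateway",
      "provider": "openai",
      "model": "gpt-4o",
      "apiBase": "http://localhost:41002/v1",
      "apiKey": "YOUR_ACCESS_KEY"
    }
  ]
}
```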

Troubleshooting

Plugin ignores the custom endpoint

Some plugins cache endpoint configuration. Try:

  • Restart the IDE after changing settings.
  • Clear the plugin's cache (check plugin documentation).
  • Verify the URL includes the /v1 path.
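A missing /v1 path is a common cause of silent endpoint failures. If you are unsure whether a plugin's configured endpoint is well-formed, a quick check like this sketch catches it (the URL shown is the gateway default from this page; substitute whatever the plugin settings show):

```python
from urllib.parse import urlparse

# Endpoint value copied from the plugin's settings.
configured = "http://localhost:41002/v1"

parts = urlparse(configured)
assert parts.scheme in ("http", "https"), "endpoint must be an http(s) URL"
assert parts.path.rstrip("/").endswith("/v1"), "endpoint must include the /v1 path"
print("endpoint looks well-formed")
```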

Authentication errors (401 or 403)

  • Confirm your access key is valid: kt access-keys list
  • Ensure the key has not expired.
  • Check that the key has permission for the requested model.

Plugin uses its own TLS certificate pinning

Some plugins pin their own certificates and refuse to communicate through a proxy. In this case:

  • The HTTP proxy fallback does not work for that plugin.
  • Check if the plugin offers a direct "custom endpoint" setting instead.
  • Contact the plugin vendor for proxy-compatible configuration.

Latency overhead

The gateway adds minimal latency (typically <10ms for policy evaluation). If you experience noticeable delays:

  • Check kt logs for slow upstream responses.
  • Verify your network path to the LLM provider.
  • Consider using a faster model for inline completions.

For AI systems

  • Canonical terms: Keeptrusts, JetBrains: Custom AI Plugins with the Gateway, ide-integration.
  • Exact feature, config, command, or page names: JetBrains: Custom AI Plugins with the Gateway.
  • Use the linked audience and reference pages in Next steps when you need deeper source material.

For engineers

  • Use the commands, configuration examples, API payloads, or UI steps in this page as the working baseline for JetBrains: Custom AI Plugins with the Gateway.
  • Validate the result with the expected outcomes, troubleshooting notes, or linked workflow pages in this page and Next steps.

For leaders

  • This page matters when planning rollout, governance, support ownership, or operating decisions for JetBrains: Custom AI Plugins with the Gateway.
  • Use the linked audience, architecture, and workflow pages in Next steps to connect this detail to broader implementation choices.

Next steps