VS Code: Custom AI Extensions with the Gateway

Many VS Code AI extensions support custom OpenAI-compatible endpoints. If an extension lets you configure an "API Base URL", "Endpoint", or "Server URL", you can point it at the Keeptrusts gateway for policy enforcement and audit logging.

Use this page when

  • You are configuring a VS Code AI extension to route requests through the Keeptrusts gateway as an implementation or operating workflow.
  • You need the practical steps, expected outcomes, and related validation guidance in one place.
  • If you need an exact field-by-field reference instead of a workflow page, use the linked reference pages in Next steps.

Primary audience

  • Primary: Technical Engineers
  • Secondary: AI Agents, Technical Leaders

The General Pattern

For any VS Code AI extension that accepts a custom endpoint:

  1. Find the extension's endpoint or base URL setting
  2. Set it to http://localhost:41002/v1
  3. Provide your access key or provider API key in the extension's API key field
  4. Select a model name that matches your gateway's provider configuration

That is all you need. The gateway receives OpenAI-compatible requests, applies your policy chain, forwards to the configured upstream provider, and returns the response.
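Because the gateway speaks the OpenAI chat-completions protocol, you can sanity-check the pattern outside VS Code before wiring up an extension. A minimal sketch using only the Python standard library; the gateway URL comes from this page, while the model name and `your-access-key` are placeholders you should swap for your own configuration:

```python
import json

# Gateway base URL from this page; the key and model are placeholders —
# use whatever your gateway's provider configuration expects.
GATEWAY_BASE = "http://localhost:41002/v1"

def build_chat_request(prompt, model="gpt-4o", api_key="your-access-key"):
    """Assemble the URL, headers, and JSON body for an OpenAI-compatible
    chat-completions call. Actually sending it (e.g. with urllib.request)
    is left to the caller so this sketch stays testable offline."""
    url = f"{GATEWAY_BASE}/chat/completions"
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    })
    return url, headers, body

url, headers, body = build_chat_request("Say hello")
```

Pass the resulting URL, headers, and body to any HTTP client (or `curl`) to confirm the gateway responds before pointing an extension at it.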

Prerequisites

  • Gateway running on localhost:41002
  • The AI extension installed in VS Code
  • An access key or provider API key

CodeGPT

CodeGPT is a popular VS Code extension that supports custom OpenAI-compatible providers.

Configuration

  1. Open VS Code Settings (Cmd+, on macOS, Ctrl+, on Windows/Linux)
  2. Search for "CodeGPT"
  3. Set the provider to "Custom" or "OpenAI Compatible"
  4. Configure:
```json
{
  "codegpt.apiKey": "your-access-key",
  "codegpt.customProvider.baseUrl": "http://localhost:41002/v1",
  "codegpt.customProvider.model": "gpt-4o"
}
```

Via Settings UI

  1. Open CodeGPT sidebar panel
  2. Click the gear icon → Provider Settings
  3. Select "Custom Provider"
  4. Enter base URL: http://localhost:41002/v1
  5. Enter your access key
  6. Select or type the model name

Tabby

Tabby is a self-hosted AI coding assistant. When running a local Tabby server, you can route its LLM backend calls through the Keeptrusts gateway.

Configure Tabby Server

In your Tabby server configuration, set the model backend to use the gateway:

```toml
[model.completion.http]
kind = "openai/completion"
api_endpoint = "http://localhost:41002/v1"
api_key = "your-access-key"
model_name = "gpt-4o-mini"

[model.chat.http]
kind = "openai/chat"
api_endpoint = "http://localhost:41002/v1"
api_key = "your-access-key"
model_name = "gpt-4o"
```

Configure VS Code Extension

The Tabby VS Code extension connects to your Tabby server, which then routes through the gateway:

```json
{
  "tabby.serverUrl": "http://localhost:8080"
}
```

The governance layer sits between your Tabby server and the LLM provider, not between VS Code and Tabby.

Generic OpenAI Client Extensions

Several VS Code extensions act as generic OpenAI API clients. These include:

  • OpenAI Helper
  • ChatGPT - EasyCode
  • AI Toolkit

For these extensions, look for settings with names like:

  • `apiBaseUrl` or `baseUrl`
  • `endpoint` or `serverUrl`
  • `openai.apiBase`

Set the URL to http://localhost:41002/v1 and provide your key.

Example: Generic Extension Settings

```json
{
  "aiExtension.apiBaseUrl": "http://localhost:41002/v1",
  "aiExtension.apiKey": "your-access-key",
  "aiExtension.model": "gpt-4o"
}
```

Extensions Using the OpenAI Node SDK

Some extensions use the OpenAI Node.js SDK internally. These respect the OPENAI_BASE_URL and OPENAI_API_KEY environment variables:

```shell
export OPENAI_BASE_URL="http://localhost:41002/v1"
export OPENAI_API_KEY="your-access-key"
code .
```

Launch VS Code from the terminal after setting these variables. Extensions using the SDK pick up the custom base URL automatically.
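Extensions that embed the OpenAI SDK typically resolve the base URL and key the way the SDK does: an explicit setting wins, then the environment variable, then the default. A rough Python illustration of that precedence (the `OPENAI_BASE_URL` and `OPENAI_API_KEY` variable names are real; the helper itself is hypothetical, not part of any SDK):

```python
import os

def resolve_endpoint(explicit_base=None, explicit_key=None):
    """Mimic the lookup order most OpenAI SDK wrappers use:
    explicit option first, then environment, then the default."""
    base = explicit_base or os.environ.get(
        "OPENAI_BASE_URL", "https://api.openai.com/v1"
    )
    key = explicit_key or os.environ.get("OPENAI_API_KEY", "")
    return base, key

# With the exports from above in place, requests route through the gateway:
os.environ["OPENAI_BASE_URL"] = "http://localhost:41002/v1"
os.environ["OPENAI_API_KEY"] = "your-access-key"
print(resolve_endpoint())
```

This is why launching `code .` from the configured terminal matters: the extension host inherits the environment of the process that started it.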

Using VS Code Workspace Settings

For project-specific gateway configuration, use workspace settings instead of user settings:

Create .vscode/settings.json in your project root:

```json
{
  "aiExtension.apiBaseUrl": "http://localhost:41002/v1",
  "aiExtension.apiKey": "${env:KEEPTRUSTS_ACCESS_KEY}"
}
```

> **Note:** Not all extensions support environment variable interpolation in settings. Check the extension's documentation.

Verify the Integration

After configuring any extension:

  1. Trigger an AI interaction (chat, completion, or inline edit)
  2. Watch for events in a separate terminal:

```shell
kt events tail
```

  3. Confirm events appear with the correct model and user attribution

Troubleshooting

Connection Refused

  • Verify the gateway is running: `curl http://localhost:41002/v1/models`
  • Check the URL includes `/v1` — some extensions append it automatically, others do not
  • If the extension adds `/v1` itself, try setting the base URL to `http://localhost:41002` without the path
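The doubled-/v1 pitfall above can be checked mechanically. A small helper (hypothetical, not part of any extension or the gateway) that normalizes a configured base URL so the final request path contains the `/v1` segment exactly once:

```python
def normalize_base_url(base, extension_appends_v1=False):
    """Return the base URL an extension should be given so the final
    request path ends up with exactly one /v1 segment."""
    base = base.rstrip("/")
    has_v1 = base.endswith("/v1")
    if extension_appends_v1 and has_v1:
        return base[: -len("/v1")]  # the extension will add it back
    if not extension_appends_v1 and not has_v1:
        return base + "/v1"
    return base
```

If requests 404, comparing the extension's actual request path (from gateway logs) against this rule usually pinpoints which side is adding or dropping the segment.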

CORS Errors

Some extensions make requests from VS Code's webview (browser context), which enforces CORS:

  • The gateway does not require CORS configuration for localhost requests from Node.js contexts
  • If you see CORS errors, the extension is likely making requests from a webview
  • Check if the extension has a "use native fetch" or "server-side requests" option

Self-Signed Certificate Errors

When running the gateway with TLS and a self-signed cert:

```json
{
  "aiExtension.rejectUnauthorized": false
}
```

Or set the environment variable before launching VS Code:

```shell
export NODE_TLS_REJECT_UNAUTHORIZED=0
code .
```

> **Warning:** Only disable certificate verification in development. Use proper certificates for production.
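If you are probing the gateway over HTTPS from a script while diagnosing this, Python's standard `ssl` module shows what "disable certificate verification" amounts to. A development-only sketch, equivalent in spirit to `NODE_TLS_REJECT_UNAUTHORIZED=0`:

```python
import ssl

# DEVELOPMENT ONLY: build a context that skips certificate validation,
# mirroring rejectUnauthorized: false in Node-based extensions.
# check_hostname must be disabled before verify_mode is relaxed.
ctx = ssl.create_default_context()
ctx.check_hostname = False
ctx.verify_mode = ssl.CERT_NONE

# Pass ctx to urllib.request.urlopen(url, context=ctx) when probing a
# gateway that presents a self-signed certificate.
```

The same caveat as above applies: never carry this context into production code.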

Wrong Response Format

If the extension expects a specific response format that differs from what the gateway returns:

  1. Confirm the extension expects OpenAI-compatible responses (most do)
  2. Verify the model name matches one the gateway recognizes
  3. Check gateway logs for upstream errors

Authentication Failures

  • Ensure the API key field is not empty
  • Try the raw provider key if access keys are not working
  • Check the key has not expired in the console

Finding Extension Settings

If you are unsure where an extension stores its endpoint configuration:

  1. Open VS Code Settings (Cmd+, on macOS, Ctrl+, on Windows/Linux)
  2. Search for the extension name
  3. Look for fields labeled: API, endpoint, base URL, server, host
  4. Check the extension's README or documentation on the marketplace

Alternatively, search the extension's source code on GitHub for `baseUrl`, `apiBase`, or `endpoint`.

For AI systems

  • Canonical terms: Keeptrusts, VS Code: Custom AI Extensions with the Gateway, ide-integration.
  • Exact feature, config, command, or page names: VS Code: Custom AI Extensions with the Gateway.
  • Use the linked audience and reference pages in Next steps when you need deeper source material.

For engineers

  • Use the commands, configuration examples, API payloads, or UI steps in this page as the working baseline for VS Code: Custom AI Extensions with the Gateway.
  • Validate the result with the expected outcomes, troubleshooting notes, or linked workflow pages in this page and Next steps.

For leaders

  • This page matters when planning rollout, governance, support ownership, or operating decisions for VS Code: Custom AI Extensions with the Gateway.
  • Use the linked audience, architecture, and workflow pages in Next steps to connect this detail to broader implementation choices.

Next steps