VS Code: Sourcegraph Cody with the Gateway
Sourcegraph Cody is an AI coding assistant that provides chat, completions, and code navigation powered by LLMs. You can route Cody's LLM traffic through the Keeptrusts gateway to apply governance policies, log interactions, and enforce cost controls.
Use this page when
- You are working through VS Code: Sourcegraph Cody with the Gateway as an implementation or operating workflow in Keeptrusts.
- You need the practical steps, expected outcomes, and related validation guidance in one place.
- For an exact field-by-field reference rather than a workflow walkthrough, use the linked reference pages in Next steps.
Primary audience
- Primary: Technical Engineers
- Secondary: AI Agents, Technical Leaders
Integration Approaches
Cody's integration with the Keeptrusts gateway depends on your Cody deployment model:
| Cody Setup | Integration Method |
|---|---|
| Cody Enterprise (self-hosted Sourcegraph) | Configure Sourcegraph server to use gateway as LLM backend |
| Cody Enterprise (cloud) | HTTP proxy interception |
| Cody Free/Pro | HTTP proxy interception |
For enterprise self-hosted deployments, the cleanest approach is configuring the Sourcegraph server's LLM completions endpoint to route through a centrally deployed Keeptrusts gateway. For individual developer setups, proxy interception works with all Cody tiers.
Prerequisites
- Gateway running on `localhost:41002`
- Sourcegraph Cody extension installed in VS Code
- Active Cody account (Free, Pro, or Enterprise)
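A quick sanity check before proceeding. The `/v1/models` path matches the health check used in Troubleshooting below; the grep pattern is a loose match since the exact extension ID can vary by release:

```sh
# Confirm the gateway answers on its port
curl -s http://localhost:41002/v1/models
# Confirm the Cody extension is installed
code --list-extensions | grep -i sourcegraph
```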
Method 1: HTTP Proxy Interception
Route Cody's traffic through the gateway by configuring VS Code's proxy settings.
Configure VS Code Proxy
Open `settings.json` (Cmd+Shift+P → "Preferences: Open User Settings (JSON)"):
```json
{
  "http.proxy": "http://localhost:41002",
  "http.proxyStrictSSL": false,
  "http.proxySupport": "on"
}
```
This routes all VS Code HTTP traffic, including Cody's LLM requests, through the Keeptrusts gateway.
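Before involving Cody, you can confirm the gateway actually proxies traffic by sending a request through it the way VS Code would. This assumes the gateway handles standard HTTPS CONNECT tunneling, which the proxy settings above imply:

```sh
# Request Sourcegraph through the gateway acting as an HTTP proxy;
# any HTTP response headers coming back mean the proxy path works end to end
curl -x http://localhost:41002 -sI https://sourcegraph.com | head -n 3
```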
Environment Variable Approach
Alternatively, set proxy environment variables before launching VS Code:
```sh
export HTTP_PROXY="http://localhost:41002"
export HTTPS_PROXY="http://localhost:41002"
code .
```
Add these to your shell profile for persistence:
```sh
# ~/.zshrc or ~/.bashrc
export HTTP_PROXY="http://localhost:41002"
export HTTPS_PROXY="http://localhost:41002"
```
Method 2: Cody Enterprise Custom LLM Endpoint
If you run a self-hosted Sourcegraph instance, configure it to use the Keeptrusts gateway as the LLM completions backend.
In your Sourcegraph site configuration:
```json
{
  "completions": {
    "provider": "openai",
    "endpoint": "http://keeptrusts-gateway:41002/v1",
    "chatModel": "gpt-4o",
    "completionModel": "gpt-4o-mini",
    "accessToken": "your-access-key"
  }
}
```
This approach routes all Cody traffic for your organization through the gateway at the server level, without requiring individual developer proxy configuration.
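Since the site configuration above treats the gateway as an OpenAI-compatible backend, you can validate it from the Sourcegraph host before rolling the change out. The `/v1/chat/completions` path is an assumption based on the `openai` provider setting; adjust if your gateway exposes a different route:

```sh
# Minimal OpenAI-style request against the gateway, using the same
# access key and chat model as the site configuration above
curl http://keeptrusts-gateway:41002/v1/chat/completions \
  -H "Authorization: Bearer your-access-key" \
  -H "Content-Type: application/json" \
  -d '{"model": "gpt-4o", "messages": [{"role": "user", "content": "ping"}]}'
```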
Configure the Gateway for Cody
Ensure your `policy-config.yaml` supports Cody's traffic patterns:
```yaml
pack:
  name: vscode-cody-providers-1
  version: 1.0.0
  enabled: true
providers:
  targets:
    - id: sourcegraph
      provider:
    - id: openai
      provider:
    - id: anthropic
      provider:
policies:
  chain:
    - audit-logger
  policy:
    audit-logger:
      immutable: true
      retention_days: 365
      log_all_access: true
```
For proxy interception mode, use the passthrough type to forward Cody's requests to Sourcegraph's servers while applying your policy chain.
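A minimal sketch of a passthrough target, assuming the `type` and `upstream` keys shown here; the exact schema may differ, so check the provider reference:

```yaml
providers:
  targets:
    - id: sourcegraph
      provider:
        type: passthrough                  # forward requests upstream unmodified
        upstream: https://sourcegraph.com  # assumed key: where proxied Cody traffic is sent
```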
Cody-Specific VS Code Settings
Cody exposes several settings you can use alongside the gateway:
{
"cody.serverEndpoint": "https://sourcegraph.com",
"cody.proxy": "http://localhost:41002",
"cody.autocomplete.enabled": true
}
Availability of the `cody.proxy` setting depends on your Cody extension version. If it is not available, fall back to the VS Code-level proxy settings described in Method 1.
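To check which extension version you have:

```sh
# List installed extensions with versions; look for the Cody entry
code --list-extensions --show-versions | grep -i cody
```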
Verify Traffic Is Flowing
Open a terminal and start watching gateway events:
```sh
kt events tail
```
Then trigger Cody activity in VS Code:
- Open a code file
- Start a Cody chat (Cmd+Shift+C or click the Cody icon)
- Ask a code question
- Wait for Cody's response
You should see events in the tail output:
```
[2024-01-15 11:15:22] INPUT sourcegraph chat/completions user:dev1 PASS
[2024-01-15 11:15:24] OUTPUT sourcegraph chat/completions user:dev1 PASS
```
Verify Policy Enforcement
Test that policies apply to Cody traffic:
- Send a chat message containing a fake secret: `How do I use this API key? AKIA1234567890EXAMPLE`
- Check the gateway event log for redaction: `kt events tail --filter action=redact`
- Verify the secret was stripped before reaching the LLM provider
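If the redaction policy fired, an event should appear in the same format as the earlier output. The action label below is an assumption; the exact label depends on how your policy pack reports redactions:

```
[2024-01-15 11:16:05] INPUT sourcegraph chat/completions user:dev1 REDACT
```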
Limitations and Workarounds
Authentication Model
Cody authenticates with Sourcegraph using its own token system. When using proxy interception, the gateway preserves Cody's auth headers and applies policies without interfering with authentication.
Cody Gateway vs. Keeptrusts Gateway
Sourcegraph operates its own "Cody Gateway" service for routing LLM requests. Do not confuse this with the Keeptrusts gateway. When using proxy interception, the traffic flow is:
Cody Extension → Keeptrusts Gateway (proxy) → Cody Gateway → LLM Provider
Autocomplete Traffic
Cody's autocomplete sends frequent, small requests. All of these route through the proxy and are subject to your policies. If you find autocomplete latency increases:
- Consider exempting autocomplete requests from expensive policies
- Use `policy-config.yaml` path-based rules to apply lighter policies to completion endpoints (see the sketch after this list)
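A sketch of what such a rule might look like. The `routes` and `match` keys are assumptions about the policy schema, and the autocomplete path shown is hypothetical:

```yaml
policies:
  routes:
    - match:
        path: "/.api/completions/code"   # hypothetical autocomplete endpoint path
      chain:
        - audit-logger                   # lighter chain: audit only, skip heavier policies
```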
Context Fetching
Cody fetches repository context from Sourcegraph before sending LLM requests. This context traffic also passes through the proxy. Your policies can inspect and log this context.
Troubleshooting
Cody stops responding after enabling proxy
- Verify the gateway is running: `curl http://localhost:41002/v1/models`
- Check the VS Code Output panel → "Cody AI by Sourcegraph" for errors
- Ensure `http.proxyStrictSSL` is `false` if the gateway uses self-signed certificates
Authentication errors with proxy
- Confirm your Cody token is still valid
- Check that the gateway is not stripping authorization headers in passthrough mode
- Try accessing Sourcegraph directly (disable proxy temporarily) to isolate the issue
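One way to run that direct check without changing VS Code settings is to bypass the proxy for a single request. The token variable name is illustrative; the `/.api/graphql` endpoint and `Authorization: token` scheme are Sourcegraph's documented API:

```sh
# Query Sourcegraph directly, skipping any proxy, to confirm the token works
curl --noproxy '*' -s \
  -H "Authorization: token $CODY_ACCESS_TOKEN" \
  -d '{"query": "query { currentUser { username } }"}' \
  https://sourcegraph.com/.api/graphql
```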
No events appear
- Restart VS Code after changing proxy settings
- Trigger a Cody chat interaction (autocomplete may be cached)
- Check gateway logs for incoming connections
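A quick way to confirm the gateway is listening at all:

```sh
# Exit code 0 means something is accepting connections on the gateway port
nc -z localhost 41002 && echo "gateway port open"
```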
High latency on autocomplete
The proxy adds minimal overhead (<10ms). If autocomplete feels slow:
- Check the gateway policy evaluation time in event logs
- Ensure the gateway has low-latency connectivity to Sourcegraph's servers
- Consider lighter policies for high-frequency autocomplete requests
For AI systems
- Canonical terms: Keeptrusts, VS Code: Sourcegraph Cody with the Gateway, ide-integration.
- Exact feature, config, command, or page names: VS Code: Sourcegraph Cody with the Gateway.
- Use the linked audience and reference pages in Next steps when you need deeper source material.
For engineers
- Use the commands, configuration examples, API payloads, or UI steps in this page as the working baseline for VS Code: Sourcegraph Cody with the Gateway.
- Validate the result with the expected outcomes, troubleshooting notes, or linked workflow pages in this page and Next steps.
For leaders
- This page matters when planning rollout, governance, support ownership, or operating decisions for VS Code: Sourcegraph Cody with the Gateway.
- Use the linked audience, architecture, and workflow pages in Next steps to connect this detail to broader implementation choices.
Next steps
- Access Keys and Authentication — understand key management for gateway access
- VS Code: Continue Extension — an alternative assistant with native endpoint support
- IDE Integration Overview — explore other IDE options