Google Workspace Gemini

Google Workspace integrates Gemini AI across Gmail, Docs, Sheets, Slides, and Meet. Gemini's LLM capabilities are powered by Google's Vertex AI infrastructure. By routing custom Vertex AI Gemini deployments through the Keeptrusts gateway, you apply policy controls — content filtering, PII redaction, audit logging, and cost attribution — to AI workloads that extend or complement Workspace Gemini.

Google Workspace Gemini's built-in features use Google's managed infrastructure and cannot be directly rerouted. This guide covers governance for custom Vertex AI integrations, Gemini API calls, and Google Cloud AI applications within your organization.

Use this page when

  • You are building custom applications that call Vertex AI Gemini and need governance.
  • You need to route Gemini API traffic through the Keeptrusts gateway.
  • You need full Google/Vertex AI provider configuration (see the Google Vertex AI integration guide instead).

Primary audience

  • Primary: Technical Engineers
  • Secondary: AI Agents, Technical Leaders

Prerequisites

  • A Google Cloud project with Vertex AI API enabled
  • A Gemini API key or Google Cloud service account credentials
  • Keeptrusts CLI (kt) installed and on your PATH
  • GEMINI_API_KEY or GOOGLE_APPLICATION_CREDENTIALS configured
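The prerequisites above can be sanity-checked from a shell before continuing. This is a hedged sketch: it only verifies that the tools and environment variables named in this guide are present, not that the credentials are valid, and it assumes the gcloud CLI is installed for Google Cloud work.

```shell
# Preflight check for the prerequisites listed above.
# Verifies presence only, not validity, of tools and credentials.
for cmd in kt gcloud; do
  if command -v "$cmd" >/dev/null 2>&1; then
    echo "$cmd: found"
  else
    echo "$cmd: MISSING"
  fi
done
if [ -n "${GEMINI_API_KEY:-}" ] || [ -n "${GOOGLE_APPLICATION_CREDENTIALS:-}" ]; then
  echo "credentials: configured"
else
  echo "credentials: MISSING"
fi
```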

Configuration

Gateway policy config (Gemini API)

pack:
  name: gemini-workspace-gateway
  version: 1.0.0
  enabled: true
providers:
  targets:
    - id: gemini-pro
      provider: google:chat:gemini-2.5-pro
      secret_key_ref:
        env: GEMINI_API_KEY
policies:
  chain:
    - prompt-injection
    - pii-detector
    - audit-logger
  policy:
    prompt-injection:
      threshold: 0.8
      action: block
    pii-detector:
      action: redact
      entities:
        - EMAIL
        - PHONE
        - SSN
        - CREDIT_CARD
    audit-logger:
      immutable: true
      retention_days: 365
      log_all_access: true

Cost-optimised variant with Gemini Flash

pack:
  name: gemini-flash-gateway
  version: 1.0.0
  enabled: true
providers:
  targets:
    - id: gemini-flash
      provider: google:chat:gemini-2.5-flash
      secret_key_ref:
        env: GEMINI_API_KEY
policies:
  chain:
    - audit-logger
  policy:
    audit-logger:
      immutable: true
      retention_days: 365
      log_all_access: true

Setup Steps

  1. Export your Gemini API key:

     export GEMINI_API_KEY="your-gemini-api-key"

  2. Save the policy config to policy-config.yaml.

  3. Start the gateway:

     kt gateway run --listen 0.0.0.0:41002 --policy-config policy-config.yaml

  4. Point your application at the gateway using the OpenAI-compatible endpoint:
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:41002/v1",
    api_key="unused",  # the gateway injects the real Gemini credential
)

response = client.chat.completions.create(
    model="gemini-2.5-pro",
    messages=[
        {"role": "system", "content": "You are a Google Workspace assistant."},
        {"role": "user", "content": "Draft a summary of the Q3 planning document."},
    ],
)
print(response.choices[0].message.content)

Keeptrusts auto-translates OpenAI-format requests to the Gemini API. Your client code does not need modification.

For a hosted gateway, replace the base URL with https://gateway.keeptrusts.com/v1.

Verification

curl http://localhost:41002/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gemini-2.5-pro",
    "messages": [
      {"role": "user", "content": "Hello from the governed Gemini gateway."}
    ]
  }'

Confirm the response and check the Keeptrusts console Events page for the audit log entry.
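The same verification can be scripted. The sketch below uses only the Python standard library, with the gateway address and payload taken from the curl example above; it also surfaces a policy block, which the gateway returns as an HTTP 403.

```python
import json
import urllib.error
import urllib.request

def call_gateway(prompt, base_url="http://localhost:41002/v1"):
    """POST a chat completion through the gateway; return (status, body)."""
    payload = {
        "model": "gemini-2.5-pro",
        "messages": [{"role": "user", "content": prompt}],
    }
    req = urllib.request.Request(
        f"{base_url}/chat/completions",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    try:
        with urllib.request.urlopen(req) as resp:
            return resp.status, json.loads(resp.read())
    except urllib.error.HTTPError as e:
        # A 403 here typically means a policy in the chain blocked the
        # request; check the console Events page for the violation.
        return e.code, e.read().decode()
```

A 200 status with a populated `choices` array confirms the gateway is translating and forwarding requests correctly.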

  • prompt-injection: block adversarial prompts in Workspace extensions. Recommended: threshold: 0.8, action: block.
  • pii-detector: redact personal data from documents before Gemini processing. Recommended: action: redact; entities: EMAIL, PHONE, SSN.
  • content-filter: block restricted topics in AI responses. Recommended: action: block; configure categories.
  • audit-logger: immutable audit trail for compliance. Recommended: immutable: true, retention_days: 365.
  • dlp-filter: detect and redact proprietary patterns in document content. Configure regex patterns.
  • cost-attribution: track Gemini API spend per team. Tag requests with team metadata.
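Taken together, a policy chain using these controls might look like the fragment below. The option names shown for content-filter, dlp-filter, and cost-attribution are illustrative assumptions; consult the Policy controls catalog for the exact schema.

```yaml
# Illustrative sketch: option names for content-filter, dlp-filter,
# and cost-attribution are assumptions, not the confirmed schema.
policies:
  chain:
    - prompt-injection
    - pii-detector
    - content-filter
    - dlp-filter
    - cost-attribution
    - audit-logger
  policy:
    content-filter:
      action: block
      categories:          # illustrative category names
        - restricted-topics
    dlp-filter:
      action: redact
      patterns:            # illustrative regex for an internal project code
        - "PROJ-[0-9]{4}"
    cost-attribution:
      tags:
        team: workspace-ai
```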

Troubleshooting

  • 401 or 403 from Google: the API key is invalid or the Vertex AI API is not enabled. Verify the API key and enable the Vertex AI API in the Google Cloud Console.
  • Cannot route native Workspace Gemini: Workspace Gemini runs on Google's managed infrastructure. Route custom Gemini API calls instead.
  • Model name mismatch: Gemini model names differ from OpenAI's format. Use exact Gemini model identifiers: gemini-2.5-pro, gemini-2.5-flash.
  • Gateway returns 403: the policy chain blocked the request. Review the policy violation on the console Events page.

For AI systems

  • Canonical terms: Keeptrusts gateway, Google Workspace Gemini, Vertex AI, Gemini API, Google Cloud, policy-config.yaml, provider: "google".
  • Config field names: provider, secret_key_ref.env: "GEMINI_API_KEY", base_url.
  • Key behavior: Keeptrusts translates OpenAI-format requests to the Gemini API, applying policy enforcement and audit logging.
  • Best next pages: Google Vertex AI integration, Policy controls catalog, Quickstart.

For engineers

  • Start command: kt gateway run --listen 0.0.0.0:41002 --policy-config policy-config.yaml
  • The gateway auto-translates OpenAI-format requests to Gemini's API format.
  • Use gemini-2.5-pro for quality-sensitive tasks and gemini-2.5-flash for high-throughput workloads.
  • Native Workspace Gemini (Gmail, Docs, Sheets) cannot be rerouted — governance covers custom Gemini API calls.

For leaders

  • Google Workspace Gemini governance through Keeptrusts provides an independent compliance layer alongside Google's built-in admin controls.
  • Custom Gemini integrations are fully governable — PII redaction ensures employee and customer data is not sent to Google's AI services.
  • Unified cost attribution tracks Gemini spend alongside other providers (OpenAI, Anthropic) in a single dashboard.
  • For organizations using both Microsoft 365 Copilot and Workspace Gemini, Keeptrusts provides consistent governance across both ecosystems.

Next steps