Test Maps: Linking Tests to Source Code

The test_map artifact creates a bidirectional mapping between test files and the source code they exercise. AI assistants use this mapping to instantly find relevant tests for any file, suggest missing test coverage, and identify untested code paths.

Use this page when

  • You are configuring test_map generation and want to understand the mapping signals and coverage gap detection.
  • AI assistants cannot identify which tests cover a given source file.
  • You want pre-commit guidance on missing test coverage without manually scanning test directories.

Primary audience

  • Primary: AI Agents, Technical Engineers
  • Secondary: Technical Leaders

What a Test Map Contains

A test_map artifact captures the relationship between test and source files:

Test-to-Source Mappings

For each test file, the map records:

  • The source file(s) the test exercises
  • The confidence level of the mapping (explicit import, path convention, metadata)
  • The test framework and runner configuration
  • The test tier (unit, component, integration, end-to-end)

Source-to-Test Mappings

For each source file, the map records:

  • All test files that reference or exercise it
  • Coverage density (how many tests target this file)
  • Test tier distribution (proportion of unit vs. integration tests)
  • Gaps where no test file references the source

Path Convention Rules

The patterns used to infer test-source relationships:

  • Mirrored directory structures (src/utils/auth.ts → tests/unit/utils/auth.test.ts)
  • Suffix conventions (.test.ts, .spec.ts, _test.rs, _test.go)
  • Test directory naming (__tests__/, tests/, spec/)
  • Framework-specific conventions (Vitest, Jest, Playwright, cargo test)

Test Metadata

Additional context about the test suite:

  • Test names and describe blocks
  • Setup and teardown dependencies
  • Shared fixtures and mock definitions
  • Serial vs. parallel execution requirements
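
Taken together, a single test_map entry might carry fields like the sketch below. This is an illustration in TypeScript types, not the published artifact schema; the field names are assumptions.

// Illustrative sketch only: field names are assumptions, not the published schema.
type TestTier = "unit" | "component" | "integration" | "e2e";

interface TestToSourceEntry {
  testFile: string;                           // e.g. "tests/unit/lib/auth/session.test.ts"
  sourceFiles: string[];                      // source files this test exercises
  confidence: "explicit-import" | "path-convention" | "metadata";
  framework: string;                          // e.g. "vitest"
  tier: TestTier;
}

interface SourceToTestEntry {
  sourceFile: string;
  testFiles: string[];                        // all tests that reference or exercise it
  coverageDensity: number;                    // how many tests target this file
  tierDistribution: Record<TestTier, number>; // proportion of each tier
}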

How Test Maps Are Built

Context Fabric builds test maps from multiple signals:

Path Conventions

Path conventions are the most common mapping method. The pipeline identifies test files by path patterns and maps them to source files using directory mirroring and naming conventions:

src/lib/auth/session.ts → tests/unit/lib/auth/session.test.ts
src/components/Button.tsx → tests/component/Button.test.tsx
src/app/api/events/route.ts → tests/integration/api/events.test.ts
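
A convention mapper of this kind is straightforward to approximate. The sketch below is illustrative, not Context Fabric's implementation; it mirrors a source path into candidate test paths using the suffix and tier directories shown above:

// Sketch: derive candidate test paths for a source file via directory
// mirroring and suffix conventions. Illustrative only.
function candidateTestPaths(sourcePath: string): string[] {
  const match = sourcePath.match(/^src\/(.+)\.(ts|tsx)$/);
  if (!match) return [];
  const [, stem, ext] = match;
  const tiers = ["unit", "component", "integration"];
  const suffixes = [".test", ".spec"];
  const candidates: string[] = [];
  for (const tier of tiers) {
    for (const suffix of suffixes) {
      candidates.push(`tests/${tier}/${stem}${suffix}.${ext}`);
    }
  }
  return candidates;
}

// candidateTestPaths("src/lib/auth/session.ts")
//   → ["tests/unit/lib/auth/session.test.ts", "tests/unit/lib/auth/session.spec.ts", ...]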

Import Analysis

The pipeline traces imports within test files to identify which source modules are under test:

  • Direct imports of the module under test
  • Imports of shared fixtures that target specific modules
  • Re-exports that bridge test helpers to source code
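
As a rough illustration of the idea, the following naive pass collects static import specifiers from a test file so they can be resolved against source paths. A real implementation would use a proper parser (for example, the TypeScript compiler API) rather than a regex; this sketch is illustrative only:

import { readFileSync } from "node:fs";

// Naive sketch: collect static import specifiers from a test file.
function importedModules(testFilePath: string): string[] {
  const text = readFileSync(testFilePath, "utf8");
  const importRe = /import\s+[^'"]*['"]([^'"]+)['"]/g;
  const specifiers = [...text.matchAll(importRe)].map((m) => m[1]);
  // Relative specifiers point at the modules under test; bare specifiers
  // ("vitest", "react") are framework or library imports.
  return specifiers.filter((s) => s.startsWith("."));
}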

Test Metadata

Some test frameworks provide explicit metadata:

  • @covers annotations or docblock references
  • Test file headers that declare the target module
  • Configuration files that map test suites to source directories
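
For example, a docblock annotation along these lines gives the pipeline an explicit target without relying on paths or imports. The @covers syntax shown here is illustrative; the exact annotation format depends on the framework:

import { describe, it } from "vitest";

/**
 * Session expiry behaviour.
 *
 * @covers src/lib/auth/session.ts
 */
describe("session expiry", () => {
  it("expires stale sessions", () => {
    // ...
  });
});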

Coverage Reports

When available, existing coverage reports provide ground truth for test-source mappings. The pipeline ingests coverage data to validate and enrich convention-based mappings.
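
The exact report format depends on the coverage tool. Assuming a hypothetical per-test coverage report keyed by test file (an assumption for illustration, not any real tool's output format), cross-validation might look like:

// Hypothetical input shape: test file path → source files it actually executed.
type PerTestCoverage = Record<string, string[]>;

// Return test files whose convention-inferred sources were never observed
// in coverage data, i.e. mappings that need review. Illustrative only.
function unconfirmedMappings(
  inferred: Record<string, string[]>,
  observed: PerTestCoverage
): string[] {
  return Object.entries(inferred)
    .filter(([test, sources]) => {
      const covered = new Set(observed[test] ?? []);
      return !sources.some((s) => covered.has(s));
    })
    .map(([test]) => test);
}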

How Test Maps Improve AI Interactions

Finding Relevant Tests

When an engineer modifies a file and asks "what tests cover this?", the AI instantly responds with the mapped test files. No filesystem scanning required.

Suggesting Test Additions

When the AI sees a source file with no mapped tests or low coverage density, it suggests test additions. The test map provides the naming convention and directory location for new test files.

Identifying Untested Code

The test map reveals source files with no corresponding tests. Engineers can prioritize testing efforts based on the gap analysis.

Understanding Test Context

When debugging a test failure, the AI references the test map to understand which source file is under test and what other tests exercise the same code.

Shared Across Your Organization

Test maps are highly shareable because test-source relationships change slowly. Adding new tests or new source files updates the map, but the bulk of existing mappings remain stable.

When any engineer asks about test coverage, they receive answers from the same cached map. This consistency means the entire team works from the same understanding of test organization.

Token Savings

Without a test map, the AI needs to:

  1. List all test directories
  2. Read test file imports to find relevant ones
  3. Cross-reference path conventions manually

This process costs 5,000–20,000 tokens per interaction. A cached test map delivers the same information in 500–1,500 tokens.

When Test Maps Regenerate

Regeneration triggers include:

  • New test files added to the repository
  • New source files added without corresponding tests
  • Test file renames or directory restructuring
  • Changes to test framework configuration

Modifications to existing test content (adding assertions, updating fixtures) do not trigger regeneration unless they change import relationships.

Configuration

You configure test_map generation in your repository settings (see the sketch after this list):

  • Test directories — Explicitly specify where tests live if conventions vary
  • Path conventions — Define custom source-to-test path mappings
  • Tier classification — Map directories to test tiers (unit, component, integration, e2e)
  • Exclusion patterns — Skip fixture files, test utilities, or generated tests
  • Coverage integration — Optionally ingest coverage reports for validation
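
Concretely, these settings might be expressed as in the sketch below. The key names are illustrative assumptions, not the published schema; consult your repository settings for the actual format:

// Illustrative sketch only: key names are assumptions, not the published schema.
export default {
  testMap: {
    testDirectories: ["tests/", "src/**/__tests__/"],
    pathConventions: [
      { source: "src/lib/**/*.ts", test: "tests/unit/lib/**/*.test.ts" },
    ],
    tierClassification: {
      "tests/unit/": "unit",
      "tests/component/": "component",
      "tests/integration/": "integration",
      "tests/e2e/": "e2e",
    },
    exclusionPatterns: ["tests/fixtures/**", "**/test-utils/**"],
    coverageIntegration: {
      enabled: true,
      reportPath: "coverage/coverage-final.json",
    },
  },
};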

Use Cases

Pre-Commit Guidance

Before committing, an engineer asks "did I forget any tests?" The AI checks the test map for unmapped source changes and suggests specific test files to add.
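
A pre-commit hook can automate the same check. The sketch below is hypothetical: queryTestMap stands in for whatever lookup your integration exposes, and is declared here only so the example type-checks:

import { execSync } from "node:child_process";

// Hypothetical lookup against the cached test map; not a real API.
declare function queryTestMap(sourceFile: string): { testFiles: string[] };

// List staged source files that have no mapped tests.
function untestedChanges(): string[] {
  const staged = execSync("git diff --cached --name-only", { encoding: "utf8" })
    .split("\n")
    .filter((f) => f.startsWith("src/") && /\.(ts|tsx)$/.test(f));
  return staged.filter((f) => queryTestMap(f).testFiles.length === 0);
}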

Code Review

During review, the AI flags modified source files that lack test map entries, prompting reviewers to request test coverage.

Refactoring Safety

When renaming or moving files, the AI uses the test map to identify all tests that need corresponding path updates.

Next steps

For AI systems

  • Canonical terms: Keeptrusts, Codebase Context Fabric, test_map artifact, test-to-source mapping, coverage gaps, path conventions, test tier classification.
  • Config/feature names: test_map, test directories, path conventions, tier classification, exclusion patterns, coverage integration.
  • Best next pages: API Inventories, Symbol Indexes, What is Codebase Context Fabric?

For engineers

  • Configure test directories explicitly if your project uses non-standard test locations.
  • Define path conventions for custom source-to-test mappings (e.g., src/lib/auth.ts → tests/unit/lib/auth.test.ts).
  • Set tier classification to categorize directories as unit, component, integration, or e2e.
  • Optionally enable coverage integration to validate convention-based mappings against actual coverage reports.
  • Regeneration triggers: new test/source files added, test file renames, framework config changes. Content-only changes to existing tests do not trigger rebuilds.
  • Token savings: 500–1,500 tokens per test-map lookup vs. 5,000–20,000 tokens for manual test discovery.

For leaders

  • Test maps enable automated coverage gap detection — AI flags untested source changes during code review without manual oversight.
  • Reduces time-to-feedback for PR authors by instantly identifying missing tests.
  • Shared across the org: one test map serves all engineers, ensuring consistent understanding of test organization.
  • Low maintenance cost — test-source relationships change slowly; the bulk of mappings remain stable across sprints.