
Configuration reference

handover supports three configuration sources on top of its built-in defaults. Values are merged in this precedence order (highest to lowest):

  1. CLI flags — --provider openai, --model gpt-4o, etc.
  2. Environment variables — HANDOVER_PROVIDER, HANDOVER_MODEL, HANDOVER_OUTPUT
  3. .handover.yml — config file in the project root
  4. Zod defaults — built-in defaults; most keys have sensible values

The authoritative schema is in src/config/schema.ts. The loading logic and precedence implementation are in src/config/loader.ts.
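As a concrete illustration of the precedence rules, consider this config file (values are hypothetical):

```yaml
# .handover.yml — lower precedence than env vars and CLI flags
provider: anthropic        # HANDOVER_PROVIDER=openai in the environment would win
model: claude-sonnet-4-5   # a --model CLI flag would win over both
output: docs/handover      # used as-is unless HANDOVER_OUTPUT or --output is set
```

With HANDOVER_PROVIDER=openai exported and handover invoked with --model gpt-4o, the resolved configuration would be provider openai, model gpt-4o, output docs/handover.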

provider

Type: string (enum)
Default: anthropic
Valid values: anthropic, openai, ollama, groq, together, deepseek, azure-openai, custom
Env override: HANDOVER_PROVIDER

The LLM provider to use for AI analysis rounds. Each named provider comes with preset defaults (base URL, default model, API key env var). See providers for a full comparison.

provider: openai

model

Type: string
Default: provider default (e.g. claude-opus-4-6 for Anthropic)
Env override: HANDOVER_MODEL

The model name to use. When omitted, the provider preset’s default model is used. Specify a model if you want a faster or cheaper alternative, or to pin a specific version.

model: claude-sonnet-4-5

apiKeyEnv

Type: string
Default: provider default (e.g. ANTHROPIC_API_KEY)

The name of the environment variable that holds the API key. Useful when you have the key stored under a non-standard name, or when using the custom provider. API keys are never read from the config file — they are always resolved from the environment at runtime.

apiKeyEnv: MY_CUSTOM_LLM_KEY

baseUrl

Type: string (URL)
Default: provider default

Override the API endpoint URL. Required for azure-openai (which has no shared endpoint) and custom. Also useful for routing traffic through a proxy or self-hosted gateway.

baseUrl: https://my-gateway.internal/v1
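For example, a custom provider routed through an internal gateway typically needs provider, baseUrl, and apiKeyEnv set together (the gateway URL and key name below are placeholders):

```yaml
provider: custom
baseUrl: https://my-gateway.internal/v1   # placeholder gateway endpoint
apiKeyEnv: MY_CUSTOM_LLM_KEY              # the key itself is read from the environment, never from this file
```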

timeout

Type: number (integer, milliseconds)
Default: provider default (120000 for cloud providers, 300000 for Ollama)

Per-request timeout in milliseconds. Increase for slow models or large projects. Cloud providers default to 2 minutes; Ollama defaults to 5 minutes.

timeout: 180000

output

Type: string (path)
Default: ./handover
Env override: HANDOVER_OUTPUT

Directory where handover writes its 14 output documents. Relative paths are resolved from the current working directory. The directory is created if it doesn’t exist.

output: docs/handover

audience

Type: string (enum)
Default: human
Valid values: human, ai

Controls output formatting. human produces readable prose with narrative flow. ai produces structured output with YAML front-matter blocks and explicit section headers — optimized for ingestion by AI coding tools and RAG pipelines.

audience: ai

include

Type: string[] (glob patterns)
Default: ["**/*"]

Glob patterns specifying which files to include in the analysis. By default all files are included (subject to exclude). Use this to narrow analysis to a subdirectory or file type.

include:
  - 'src/**'
  - '*.ts'

exclude

Type: string[] (glob patterns)
Default: []

Glob patterns for files to exclude from analysis. Applied after include. Common uses: exclude generated code, legacy directories, or large binary assets.

exclude:
  - '**/*.generated.ts'
  - 'legacy/**'
  - 'dist/**'

context

Type: string
Default: (none)

Additional free-text context injected into AI prompts. Use this to tell the AI about domain-specific constraints, conventions, or project goals that aren’t obvious from the code alone.

context: |
  This is an internal tool used by the payments team.
  All financial calculations must be in integer cents, never floats.

costWarningThreshold

Type: number (USD)
Default: (none — no warning)

If set, handover warns before proceeding when the estimated cost exceeds this value. Useful as a guard against accidentally running expensive analysis on large repos.

costWarningThreshold: 2.00

project

Provide metadata about your project. These values are injected into AI prompts and appear in the generated documents. All fields are optional.

project.name

Type: string
Default: (inferred from package.json or directory name)

The project’s display name. Used in document headings and introductions.

project:
  name: Order Management Service

project.description

Type: string
Default: (none)

A one-sentence description of what the project does. Injected into AI prompts to improve context quality.

project:
  description: 'REST API for managing customer orders from placement to fulfilment'

project.domain

Type: string
Default: (none)

The business or technical domain (e.g. e-commerce, fintech, devtools). Helps the AI frame its analysis appropriately.

project:
  domain: fintech

project.teamSize

Type: string
Default: (none)

Team size context (e.g. 1, 5, 50+). The AI uses this to calibrate the level of detail in documentation and onboarding guidance.

project:
  teamSize: '8'

project.deployTarget

Type: string
Default: (none)

Where the project is deployed (e.g. AWS Lambda, Kubernetes, Vercel, bare metal). Used in the Deployment document.

project:
  deployTarget: AWS ECS Fargate

analysis.concurrency

Type: number (positive integer)
Default: 4

Maximum number of concurrent LLM API calls during AI analysis rounds. Reduce to 1 for providers with strict rate limits. Ollama defaults to 1 automatically.

analysis:
  concurrency: 2

analysis.staticOnly

Type: boolean
Default: false

When true, skip all AI analysis rounds. Only static analyzers run. Documents are generated with the available static data; AI-enriched sections are marked as unavailable. Equivalent to the --static-only CLI flag. Free — no API key required.

analysis:
  staticOnly: true

contextWindow

These options control which files are prioritized when packing the source code into the AI context window.

contextWindow.maxTokens

Type: number (positive integer)
Default: (provider preset — e.g. 200000 for Anthropic)

Override the token budget for the context window. Reduce to lower cost; increase if your provider supports a larger window than the preset default.

contextWindow:
  maxTokens: 100000

contextWindow.pin

Type: string[] (glob patterns)
Default: []

Files matching these globs are always included in the AI context window, regardless of scoring. Use for core files that must be present for accurate analysis.

contextWindow:
  pin:
    - 'src/core/**'
    - 'src/types/**'

contextWindow.boost

Type: string[] (glob patterns)
Default: []

Files matching these globs receive a higher priority score during context packing. They are included before lower-priority files when the budget is tight, but not guaranteed to be included (unlike pin).

contextWindow:
  boost:
    - 'src/api/**'
    - 'src/services/**'
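The two lists compose: under a deliberately tight budget, pinned files are guaranteed a slot while boosted files merely jump the queue. A sketch (budget and paths are illustrative):

```yaml
contextWindow:
  maxTokens: 80000          # tight budget to force prioritization
  pin:
    - 'src/core/**'         # always packed, regardless of scoring
  boost:
    - 'src/api/**'          # packed ahead of unboosted files, but may still be dropped
```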

Full example

# Provider and model
provider: anthropic
model: claude-sonnet-4-5

output: docs/handover
audience: human

# File selection
include:
  - '**/*'
exclude:
  - 'dist/**'
  - '**/*.generated.ts'
  - 'node_modules/**'

# Additional context for AI prompts
context: |
  Internal tooling project. Audience is senior engineers familiar with TypeScript.

# Cost guard
costWarningThreshold: 3.00

# Project metadata
project:
  name: My Project
  description: 'A brief description of what this project does'
  domain: devtools
  teamSize: '5'
  deployTarget: GitHub Actions + npm

# Analysis tuning
analysis:
  concurrency: 4
  staticOnly: false

# Context window
contextWindow:
  maxTokens: 150000
  pin:
    - 'src/core/**'
  boost:
    - 'src/api/**'