# Getting started

handover scans a codebase and produces 14 interconnected markdown documents that explain the project end-to-end. This guide takes you from install to first output.
## Prerequisites

- Node.js >= 18 — check with `node --version`
- An LLM API key — Anthropic is the default; see providers for alternatives. Ollama works without an API key.
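The version requirement can be checked in a script before anything else runs. A minimal sketch, assuming `node --version` prints the usual `vMAJOR.MINOR.PATCH` form (the hardcoded `version` here stands in for that output):

```sh
# Sketch: check the Node.js major version against the >= 18 requirement.
# In practice, replace the hardcoded value with: version=$(node --version)
version="v20.11.1"
major=${version#v}    # strip the leading "v"
major=${major%%.*}    # keep only the major component
if [ "$major" -ge 18 ]; then
  echo "Node.js $major: OK"
else
  echo "Node.js $major is below the required 18" >&2
fi
```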
## Install

### Zero-install (recommended)

```sh
npx handover-cli generate
```

No global install required. npm downloads and runs the latest version on each call.

### Global install

```sh
npm install -g handover-cli
handover generate
```

### Project dependency

```sh
npm install --save-dev handover-cli
npx handover generate
```

## First run
1. Set your API key:

   ```sh
   export ANTHROPIC_API_KEY=sk-ant-...
   ```

2. Run in any project directory:

   ```sh
   npx handover-cli generate
   ```

   That’s all that’s required. handover uses sensible defaults and needs no config file.

3. View the output, which lands in `./handover/` by default:

   ```
   handover/
     00-INDEX.md
     01-PROJECT-OVERVIEW.md
     02-GETTING-STARTED.md
     03-ARCHITECTURE.md
     04-FILE-STRUCTURE.md
     05-FEATURES.md
     06-MODULES.md
     07-DEPENDENCIES.md
     08-ENVIRONMENT.md
     09-EDGE-CASES-AND-GOTCHAS.md
     10-TECH-DEBT-AND-TODOS.md
     11-CONVENTIONS.md
     12-TESTING-STRATEGY.md
     13-DEPLOYMENT.md
   ```

## Example: what output looks like
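A quick sanity check that a run produced the full set of 14 documents can be scripted. A sketch using a simulated output directory (`handover-demo` is hypothetical; in a real project drop the `mkdir`/`touch` lines and point `outdir` at `./handover`):

```sh
# Simulate the output directory so this sketch is self-contained.
outdir=handover-demo
mkdir -p "$outdir"
for name in 00-INDEX 01-PROJECT-OVERVIEW 02-GETTING-STARTED 03-ARCHITECTURE \
            04-FILE-STRUCTURE 05-FEATURES 06-MODULES 07-DEPENDENCIES \
            08-ENVIRONMENT 09-EDGE-CASES-AND-GOTCHAS 10-TECH-DEBT-AND-TODOS \
            11-CONVENTIONS 12-TESTING-STRATEGY 13-DEPLOYMENT; do
  touch "$outdir/$name.md"
done

# The actual check: all 14 documents should be present.
count=$(ls "$outdir"/*.md | wc -l)
echo "found $count documents"
```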
The opening of `01-PROJECT-OVERVIEW.md` in a typical project:

```markdown
---
title: Project Overview
documentId: 01-project-overview
status: complete
---

# Project Overview

my-app is a Node.js REST API for managing customer orders. Built with TypeScript,
it exposes a GraphQL interface backed by PostgreSQL and is deployed to AWS Lambda.

## What This Project Does

Provides order lifecycle management — creation, payment, fulfilment, and returns —
via a GraphQL API consumed by the company's mobile and web clients.
```

## Common options
| Flag | Description |
|---|---|
| `--provider <name>` | LLM provider (anthropic, openai, ollama, …) |
| `--model <name>` | Model name override |
| `--static-only` | Static analysis only — no AI calls, no cost, no API key required |
| `--audience <mode>` | `human` (default) or `ai` for RAG-optimized output |
| `--only <aliases>` | Generate specific documents only (comma-separated, e.g. `overview,arch`) |
## Preview cost before running

```sh
npx handover-cli estimate
```

## Free static-only run (no API key needed)

```sh
npx handover-cli generate --static-only
```

Static-only mode runs all file-tree, dependency, git-history, and AST analysis without any AI calls. Documents are generated with the available static data; AI-enriched sections are noted as unavailable.
## Minimal config file

Create `.handover.yml` in your project root to customize behavior. The minimal useful config:

```yaml
provider: anthropic
output: docs/handover

project:
  name: My Project
  description: 'A brief description of what this project does'
```

Run the same command after creating the file — handover picks it up automatically:

```sh
npx handover-cli generate
```
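Creating the file can also be scripted, for example as part of a project bootstrap step. A sketch that writes the same minimal config with a heredoc (the name and description values are the placeholders from above):

```sh
# Write the minimal .handover.yml shown above.
cat > .handover.yml <<'EOF'
provider: anthropic
output: docs/handover

project:
  name: My Project
  description: 'A brief description of what this project does'
EOF

echo "wrote .handover.yml"
```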
## Next steps

- configuration — all 21 config keys with types, defaults, and valid values
- providers — compare all 8 supported LLM providers
- output-documents — understand all 14 generated documents before running