# Doc Skill
YOU MUST EXECUTE THIS WORKFLOW. Do not just describe it.
Generate and validate documentation for any project.
## Execution Steps

Given `/doc [command] [target]`:

### Step 1: Detect Project Type

```bash
# Check for indicators
ls package.json pyproject.toml go.mod Cargo.toml 2>/dev/null

# Check for existing docs
ls -d docs/ doc/ documentation/ 2>/dev/null
```
Classify as:
- CODING: Has source code, needs API docs
- INFORMATIONAL: Primarily documentation (wiki, knowledge base)
- OPS: Infrastructure, deployment, runbooks
### Step 2: Execute Command

**discover** - Find undocumented features:

```bash
# List top-level Python functions as candidates for missing docstrings
# (rough heuristic; grep cannot see the line that follows each def)
grep -rn "^def " --include="*.py" . | grep -v '"""' | head -20

# List exported Go functions; check each match for a preceding doc comment
grep -rn "^func [A-Z]" --include="*.go" . | head -20
```
**coverage** - Check documentation coverage:

```bash
# Approximate documented vs. undocumented count (Python only; assumes the
# docstring starts on the line directly after the definition.
# Go coverage needs a separate check of doc comments.)
TOTAL=$(grep -r "^def \|^class " --include="*.py" . | wc -l)
DOCUMENTED=$(grep -rA1 "^def \|^class " --include="*.py" . | grep -c '"""')
echo "Coverage: $DOCUMENTED / $TOTAL"
```
**gen [feature]** - Generate documentation:
- Read the code for the feature
- Understand what it does
- Generate appropriate documentation
- Write to docs/ directory
**all** - Update all documentation:
- Run discover to find gaps
- Generate docs for each undocumented feature
- Validate existing docs are current
### Step 3: Generate Documentation
When generating docs, include:
**For Functions/Methods:**

````markdown
## function_name
**Purpose:** What it does
**Parameters:**
- `param1` (type): Description
- `param2` (type): Description
**Returns:** What it returns
**Example:**
```python
result = function_name(arg1, arg2)
```
**Notes:** Any important caveats
````
**For Classes:**

````markdown
## ClassName
**Purpose:** What this class represents
**Attributes:**
- `attr1`: Description
- `attr2`: Description
**Methods:**
- `method1()`: What it does
- `method2()`: What it does
**Usage:**
```python
obj = ClassName()
obj.method1()
```
````
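A sketch of filling the Functions/Methods template directly from a callable, using `inspect` to read the signature. The template string mirrors the section above; the `connect` example function is hypothetical:

```python
import inspect

FUNCTION_TEMPLATE = """## {name}
**Purpose:** {purpose}
**Parameters:**
{params}
**Returns:** {returns}
"""


def _type_name(annotation) -> str:
    """Render a readable type name, or 'unknown' when none was given."""
    if annotation is inspect.Parameter.empty:
        return "unknown"
    return getattr(annotation, "__name__", str(annotation))


def render_function_doc(func) -> str:
    """Fill the Functions/Methods template from a callable's signature."""
    sig = inspect.signature(func)
    params = "\n".join(
        f"- `{name}` ({_type_name(p.annotation)}): Description"
        for name, p in sig.parameters.items()
    )
    purpose = (inspect.getdoc(func) or "TODO: describe").splitlines()[0]
    return FUNCTION_TEMPLATE.format(
        name=func.__name__,
        purpose=purpose,
        params=params,
        returns=_type_name(sig.return_annotation),
    )


# Hypothetical example function to render
def connect(host: str, port: int = 5432) -> bool:
    """Open a connection to the given host."""
    return True


print(render_function_doc(connect))
```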
### Step 4: Create Code-Map (if requested)
**Write to:** `docs/code-map/`
```markdown
# Code Map: <Project>

## Overview
<High-level architecture>

## Directory Structure
src/
├── module1/   # Purpose
├── module2/   # Purpose
└── utils/     # Shared utilities

## Key Components

### Module 1
- **Purpose:** What it does
- **Entry point:** `main.py`
- **Key files:** `handler.py`, `models.py`

### Module 2
...

## Data Flow
<How data moves through the system>

## Dependencies
<External dependencies and why>
```
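The Directory Structure section can be seeded automatically. A minimal sketch that prints a one-level tree with placeholder purpose comments (the skip list and the `src/` default are assumptions about project layout):

```python
from pathlib import Path

SKIP = {".git", "node_modules", "__pycache__", ".venv", "dist"}


def directory_tree(root: str = "src") -> str:
    """Render a one-level tree of `root` with placeholder purpose comments."""
    base = Path(root)
    children = sorted(p for p in base.iterdir() if p.is_dir() and p.name not in SKIP)
    lines = [f"{base.name}/"]
    for i, child in enumerate(children):
        connector = "└──" if i == len(children) - 1 else "├──"
        lines.append(f"{connector} {child.name}/   # Purpose")
    return "\n".join(lines)


print(directory_tree())
```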
### Step 5: Validate Documentation
Check for:
- Out-of-date docs (code changed, docs didn't)
- Missing sections (no examples, no parameters)
- Broken links
- Inconsistent formatting
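Broken relative links are the easiest of these checks to automate. A minimal sketch over `docs/`; the regex only catches standard `[text](target)` links, and external URLs and bare anchors are skipped:

```python
import re
from pathlib import Path

LINK_RE = re.compile(r"\[[^\]]*\]\(([^)]+)\)")  # [text](target)


def broken_links(docs_dir: str = "docs") -> list[str]:
    """Report relative Markdown links whose target file does not exist."""
    problems = []
    for md in Path(docs_dir).rglob("*.md"):
        for target in LINK_RE.findall(md.read_text(encoding="utf-8")):
            if target.startswith(("http://", "https://", "mailto:", "#")):
                continue
            # Resolve relative to the linking file, ignoring any #anchor suffix
            resolved = (md.parent / target.split("#")[0]).resolve()
            if not resolved.exists():
                problems.append(f"{md}: broken link -> {target}")
    return problems


for issue in broken_links():
    print(issue)
```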
### Step 6: Write Report

**Write to:** `.agents/doc/YYYY-MM-DD-<target>.md`

```markdown
# Documentation Report: <Target>
**Date:** YYYY-MM-DD
**Project Type:** <CODING/INFORMATIONAL/OPS>
## Coverage
- Total documentable items: <count>
- Documented: <count>
- Coverage: <percentage>%
## Generated
- <list of docs generated>
## Gaps Found
- <undocumented item 1>
- <undocumented item 2>
## Validation Issues
- <issue 1>
- <issue 2>
## Next Steps
- [ ] Document remaining gaps
- [ ] Fix validation issues
```
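A sketch of writing the report skeleton to the path above; the function arguments and the sections filled in here are placeholders, and the gap and validation lists would be appended the same way:

```python
from datetime import date
from pathlib import Path


def write_report(target: str, project_type: str, documented: int, total: int) -> Path:
    """Write a report skeleton to .agents/doc/YYYY-MM-DD-<target>.md."""
    report_dir = Path(".agents/doc")
    report_dir.mkdir(parents=True, exist_ok=True)
    path = report_dir / f"{date.today():%Y-%m-%d}-{target}.md"
    pct = 100 * documented / total if total else 0
    path.write_text(
        f"# Documentation Report: {target}\n\n"
        f"**Date:** {date.today():%Y-%m-%d}\n"
        f"**Project Type:** {project_type}\n\n"
        f"## Coverage\n"
        f"- Total documentable items: {total}\n"
        f"- Documented: {documented}\n"
        f"- Coverage: {pct:.0f}%\n",
        encoding="utf-8",
    )
    return path


print(write_report("coverage", "CODING", documented=45, total=67))
```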
### Step 7: Report to User
Tell the user:
- Documentation coverage percentage
- Docs generated/updated
- Gaps remaining
- Location of report
## Key Rules

- **Detect project type first** - approach varies
- **Generate meaningful docs** - not just stubs
- **Include examples** - always show usage
- **Validate existing docs** - docs can go stale
- **Write the report** - track coverage over time
## Commands Summary

| Command | Action |
|---|---|
| `discover` | Find undocumented features |
| `coverage` | Check documentation coverage |
| `gen [feature]` | Generate docs for specific feature |
| `all` | Update all documentation |
| `validate` | Check docs match code |
## Examples

### Generating API Documentation

**User says:** `/doc gen authentication`

**What happens:**

- Agent detects project type by checking for `package.json` and finding a Node.js project
- Agent searches the codebase for authentication-related functions using grep
- Agent reads authentication module files to understand the implementation
- Agent generates documentation with purpose, parameters, returns, and usage examples
- Agent writes to `docs/api/authentication.md` with code samples
- Agent validates generated docs match actual function signatures

**Result:** Complete API documentation created for the authentication module with working code examples.
### Checking Documentation Coverage

**User says:** `/doc coverage`

**What happens:**

- Agent detects a Python project from `pyproject.toml`
- Agent counts total functions/classes with `grep -r "^def \|^class "`
- Agent counts documented items by searching for docstrings (`"""`)
- Agent calculates coverage: 45/67 items = 67% coverage
- Agent writes a report to `.agents/doc/2026-02-13-coverage.md`
- Agent lists the 22 undocumented functions as gaps

**Result:** Documentation coverage report shows 67% coverage with a specific list of 22 functions needing docs.
## Troubleshooting
| Problem | Cause | Solution |
|---|---|---|
| Coverage calculation inaccurate | Grep pattern doesn't match all code styles | Adjust pattern for project conventions. For Python, check for async def and class methods. For Go, check both func and type definitions. |
| Generated docs lack examples | Missing context about typical usage | Read existing tests to find usage patterns. Check README for code samples. Ask user for typical use case if unclear. |
| Discover command finds too many items | Low existing documentation coverage | Prioritize by running discover on specific subdirectories. Focus on public API first, internal utilities later. Use --limit to process in batches. |
| Validation shows docs out of sync | Code changed after docs written | Re-run gen command for affected features. Consider adding git hook to flag doc updates needed when code changes. |