wtf.implement-task
Implement Task
Pick up an existing Task as a developer. Core value: reads the full spec (Task + Feature + Epic), maps it to the actual codebase, proposes a concrete technical approach, then drives implementation test-first against each Gherkin scenario.
The expected Task issue body structure is defined in @.github/ISSUE_TEMPLATE/TASK.md.
Process
0. GitHub CLI setup
Run steps 1–2 of ../references/gh-setup.md (install check and auth check). Stop if gh is not installed or not authenticated. Extensions are not required for this skill.
Skip this step if invoked from wtf.verify-task or another skill that already ran gh-setup this session.
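The step-0 gate can be sketched as a small guard. This is illustrative only — the helper name and message text are not part of the skill; the real checks live in ../references/gh-setup.md:

```shell
# Hypothetical helper: fail fast when a required CLI is missing.
require_cmd() {
  command -v "$1" >/dev/null 2>&1 || { echo "error: $1 is not installed" >&2; return 1; }
}

# This skill would gate on: require_cmd gh
```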
1. Identify the Task
Call AskUserQuestion (per ../references/questioning-style.md):
- question: "Which Task are you implementing?"
- header: "Task"
- options: from recent open issues labeled `task`
Walk Task → Feature → Epic per ../references/spec-hierarchy.md to extract Gherkin, Contracts, Impacted Areas (Task) and ACs / Goal / constraints (Feature, Epic).
2. Lifecycle check
Apply the absent-label gate from ../references/lifecycle-labels.md for the `designed` label on the Task — recommended skill `wtf.design-task`, header "Design check". On "Design it first" → follow wtf.design-task passing the Task number as context. On "Skip design" → proceed. If present, continue silently.
3. Load the technical steering document
Load docs/steering/TECH.md per the strict consumer-side load in ../references/steering-doc-process.md (recommended skill: wtf.steer-tech). Apply its stack, architecture patterns, key constraints, commands, and ADRs silently throughout this session.
4. Set up the branch
Set up the feature branch and task branch per ../references/branch-setup.md (slug generation, feature-branch create-or-checkout, task-branch create-or-resume). Resolve any conflicts before proceeding.
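The slug step can be illustrated with a minimal helper. This is an assumption — ../references/branch-setup.md defines the real rules; the function name and branch format below are invented for illustration:

```shell
# Hypothetical slugify: lowercase, collapse non-alphanumeric runs to hyphens.
slugify() {
  printf '%s' "$1" | tr '[:upper:]' '[:lower:]' | sed -E 's/[^a-z0-9]+/-/g; s/^-+//; s/-+$//'
}

# A task branch name might then look like:
# git checkout -b "task/<task_number>-$(slugify "<task title>")"
```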
5. Explore the codebase
Before exploring, identify the test framework setup by reading a sample of existing test files. Record the following in a working scratchpad before proceeding — these govern every test written in step 8:
| Field | Value |
|---|---|
| Test framework | (e.g. Jest, Vitest, pytest, RSpec) |
| Test file pattern | (e.g. `**/*.test.ts`, `tests/test_*.py`) |
| Import convention | (e.g. `import { describe, it } from 'vitest'`) |
| Run command | (e.g. `npm test`, `pytest`) |
| Coverage command | (e.g. `npm run coverage`, `pytest --cov`) |
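For JS/TS projects, the scratchpad's framework row can be derived with a rough check like the following. This is a sketch: it covers only a few runners, uses a grep heuristic rather than real JSON parsing, and the function name is invented:

```shell
# Guess the test runner from a package.json (very rough heuristic).
detect_test_framework() {
  if grep -q '"vitest"' "$1"; then echo vitest
  elif grep -q '"jest"' "$1"; then echo jest
  elif grep -q '"mocha"' "$1"; then echo mocha
  else echo unknown
  fi
}
```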
Use the Agent tool with these concrete searches (run in parallel):
- Grep for the domain nouns and verbs from the Task's Functional Description across `*.{ts,tsx,js,jsx,py,go,rb}` files — finds files and modules this task will touch
- Glob matching the file patterns for each Impacted Area listed in the Task (e.g. `src/api/**/*`, `src/features/<feature-slug>/**/*`) — surfaces integration points and existing patterns
- Grep for interface or type names from the Task's Contracts section — finds current interface definitions to implement against
- Glob matching the test file pattern from the scratchpad (e.g. `**/*.test.ts`) near the integration points found above — surfaces existing tests covering adjacent behavior
- Grep for any import of the domain objects or services this task depends on — identifies dependencies that must exist first
Also fetch any relevant wiki pages or in-repo glossary docs for this task's Bounded Context. Use these to ensure the implementation and test naming aligns with the team's Ubiquitous Language.
6. Draft the Technical Approach
Produce a concrete Technical Approach with actual file paths (not generic layer names):
- Architecture decisions: which layer owns what, which patterns to follow
- Data flow: how data moves from input to output
- Trade-offs: what alternatives were considered and why this approach was chosen
- Impacted Areas: concrete file paths for Backend, Frontend, Database, APIs
7. Review approach with user
Show the Technical Approach. Then call AskUserQuestion (per ../references/questioning-style.md):
- question: "Does this align with how you'd approach it?"
- header: "Approach review"
- options:
- Yes — looks good, proceed → continue with implementation
- I have constraints to share → adjust the approach first
- Suggest an alternative → describe a different approach
Apply changes. Then update the Task issue with the Technical Approach and Impacted Areas.
See `references/issue-body-update-pattern.md` for the read-merge-write pattern. Use `/tmp/wtf.implement-task-<task_number>-approach.md` as the temp file.

```
gh issue edit <task_number> --body-file /tmp/wtf.implement-task-<task_number>-approach.md
```
8. Drive the TDD cycle
Work through the Task's Gherkin scenarios in order. Match the project's established test patterns discovered in step 5. Reference the Contracts & Interfaces section for exact request/response shapes.
- Write the failing test for the scenario.
- Implement the minimum code to make it pass.
- Refactor if needed — keep functions under 40 lines, no deep nesting.
- Commit — atomic semantic commit per ../references/commit-conventions.md. Use the `Scenario:` and `Task:` trailers:

  ```
  git add <changed files>
  git commit -m "<type>(<scope>): <short description>

  Scenario: <scenario name>
  Task: #<task_number>"
  ```

- Do not skip ahead — each scenario is a checkpoint.
Once all scenarios are green, run the full lint and type-check gate once across all changes. Check package.json for `lint`, `typecheck`, `type-check`, or `check` script keys and run whichever exist:

```
# e.g. npm run lint && npm run typecheck
```

Fix any issues before proceeding to coverage.
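Selecting which scripts to run can be sketched like this. The key names come from the step above; the grep heuristic stands in for real JSON parsing, and the helper name is invented:

```shell
# List the quality-gate scripts a package.json actually defines.
gate_scripts() {
  for s in lint typecheck type-check check; do
    grep -q "\"$s\":" "$1" && echo "$s"
  done
  return 0
}

# Then run each one, e.g.: gate_scripts package.json | xargs -n1 npm run
```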
9. Verify coverage
Once all scenarios pass, confirm unit test coverage meets the minimum threshold for all new and modified code. Use the threshold specified in docs/steering/QA.md if it exists; default to 80% if the document is absent or does not define a threshold:

```
# Run the project's coverage command (check package.json scripts)
```

If coverage is below the threshold on any new or modified file, add targeted tests before proceeding. Every public function must have at least one happy-path and one error-path test.
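The threshold comparison itself is simple; a sketch follows. The percentage would come from the project's coverage reporter, which varies by stack, and the function name is illustrative:

```shell
# Compare a coverage percentage against a minimum threshold (default 80).
coverage_gate() {
  awk -v pct="$1" -v min="${2:-80}" 'BEGIN { print (pct >= min) ? "pass" : "fail" }'
}
```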
10. Update Test Mapping
Fill the Test Mapping table in the Task issue with concrete file paths:
| Gherkin Scenario | Test file | Status |
|---|---|---|
| <scenario name> | <test file path:line> | passing |
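Each table row can be emitted mechanically once a scenario goes green (the helper name and example values are illustrative):

```shell
# Format one Test Mapping row for the issue body.
mapping_row() {
  printf '| %s | %s | %s |\n' "$1" "$2" "$3"
}
```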
See `references/issue-body-update-pattern.md` for the read-merge-write pattern. Re-fetch the body (do not reuse the temp file from step 7). Use `/tmp/wtf.implement-task-<task_number>-test-mapping.md` as the temp file.

```
gh issue edit <task_number> --body-file /tmp/wtf.implement-task-<task_number>-test-mapping.md
```
Print the updated Task issue URL.
11. Mark implemented and offer to continue
Add the implemented lifecycle label — this is mandatory regardless of invocation mode:
```
gh issue edit <task_number> --add-label "implemented"
```
If invoked from the loop (non-interactive mode), skip the ask below and return control to the loop.
Call AskUserQuestion (per ../references/questioning-style.md):
- question: "What's next?"
- header: "Next step"
- options:
- Verify this Task → follow wtf.verify-task, passing the Task number in as context (recommended)
- Open a pull request → follow wtf.create-pr, passing the Task number and branch in as context
- Implement another Task → restart this skill from step 1