fp-add-spec

Add E2E Spec

Create a new E2E_TESTS.md specification file for a package by analyzing its source code, commands, and public API.

Additional instructions from the user: "$ARGUMENTS". Ignore if empty.

Phase 1: Analyze Package

  1. Identify the target package:

    • If $ARGUMENTS specifies a package path, use that.
    • If in a package directory, use the current package.
    • Otherwise, ask the user which package to analyze.
  2. Read the package to understand:

    • What the package does (README, package manifest, main entry points)
    • Public API / exported commands / CLI interface
    • External dependencies (what it calls: Docker, HTTP APIs, CLI tools)
    • Configuration files it reads or creates
    • State it manages (databases, file stores, caches)
  3. Check if an E2E_TESTS.md already exists for this package. If it does, inform the user and suggest using update-spec instead.
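Step 3's existence check can be sketched as a short shell snippet; the temp directory here is an illustrative stand-in for the target package directory:

```shell
# Stand-in for the target package directory (illustrative only)
pkg_dir="$(mktemp -d)"
touch "$pkg_dir/E2E_TESTS.md"   # simulate a package that already has a spec

# If the spec already exists, surface it rather than overwriting it
if [ -f "$pkg_dir/E2E_TESTS.md" ]; then
  echo "E2E_TESTS.md already exists; suggest update-spec instead."
fi
```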

Phase 2: Draft Spec

  1. Determine the suites. Each major command, workflow, or feature area becomes a suite.

  2. Include an Index section at the top of the spec (after the title and description) with a markdown link to every suite (e.g., - [Suite Name](#suite-name)).

  3. For each suite, draft these sections using heading levels appropriate to their nesting depth (e.g., if the suite is an H2, then these sections are H3; if the suite is an H3, they are H4; and so on):

    • Preconditions (heading with bullet list): What setup is needed (temp dir, git repo, config files, mock tools)
    • Features (heading): individual features as subheadings one level deeper, covering all of these categories:
      • core: Primary happy-path scenarios
      • edge: Boundary conditions (empty inputs, very long strings, special characters)
      • error: Failure modes (missing config, invalid input, unavailable dependencies)
      • side-effect: Observable interactions (hooks, notifications, file creation)
      • idempotency: Safe repetition (re-running commands, re-creating resources)
    • Postconditions (heading with bullet list): Verifiable end states
  4. For features requiring real external services, add <!-- skip: ... --> metadata with clear documentation on how to unskip.

  5. Ensure every feature has:

    • A <!-- category: ... --> comment
    • At least one concrete, testable assertion bullet
    • Present tense, declarative language
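Applied to a hypothetical package with an init command, a drafted suite (with the Index entry that would point to it) might look like the sketch below. Every name, assertion, and skip reason is an illustrative placeholder, not part of the skill:

```markdown
## Index

- [Init Command](#init-command)

## Init Command

### Preconditions

- Temp directory containing an empty git repository
- No existing config file

### Features

#### Creates a default config on first run

<!-- category: core -->

- Running init in an empty repo exits with code 0
- A config file is created with the documented defaults

#### Fails cleanly on invalid input

<!-- category: error -->

- Running init with an unknown flag exits non-zero
- The error message names the unrecognized flag

#### Re-running init leaves state unchanged

<!-- category: idempotency -->

- A second init run exits 0 and does not modify the existing config

#### Sends a creation webhook

<!-- category: side-effect -->
<!-- skip: requires a live webhook endpoint; set E2E_WEBHOOK_URL and remove this comment to unskip -->

- After init, the configured endpoint receives exactly one POST

### Postconditions

- Exactly one config file exists in the temp directory
- No files are written outside the temp directory
```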

Phase 3: Review

  1. Present the complete draft spec to the user.
  2. Show a summary:
    • Number of suites
    • Features per suite by category
    • Total feature count
    • Any features marked as skip
  3. Ask the user for feedback and adjustments.

Phase 4: Write

  1. Write the approved spec to E2E_TESTS.md in the package directory.
  2. If a root-level docs/E2E_TESTS.md or E2E_TESTS.md doesn't exist yet, offer to create one with project-wide testing constraints (isolation rules, cleanup patterns, environment variable conventions).
  3. Inform the user they can now run:
    • review-spec to validate the format
    • generate to create test files from the spec
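If the user accepts the offer in step 2, the root-level file could start from a minimal sketch like this; the contents and the environment-variable prefix are illustrative, not prescribed by the skill:

```markdown
# E2E Testing Constraints

- Isolation: every test runs in its own temp directory and never touches the repository checkout
- Cleanup: tests remove all files and processes they create, even on failure
- Environment: test-only overrides use a single prefix, e.g. E2E_SKIP_NETWORK=1
```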