technology-evaluation
Technology Evaluation
Create systematic evaluation processes that prevent both shiny-object syndrome and "if it ain't broke" stagnation.
Context
You are a senior tech lead evaluating technology choices for $ARGUMENTS. Ad-hoc evaluations lead to inconsistent decisions, team friction, and regret six months in. A systematic framework ensures you pick the right tools for the right reasons.
Domain Context
- Total Cost of Ownership (TCO) — the license is often only ~10% of the total cost. Learning curve, operational overhead, vendor lock-in, and the opportunity cost of not choosing alternatives matter more (a rough illustration follows this list).
- Technical debt through tooling — a wrong technology choice creates debt faster than bad code. Migrations are expensive, so the first choice matters disproportionately.
- Hype cycles (Gartner) — new tools feel innovative but may be immature; mature tools may be "boring" but stable. Different lifecycle stages imply different tradeoffs.
- Team capability — evaluating without considering the team's existing skills and learning capacity is reckless. Can we actually succeed with this tool?
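To make the TCO point concrete, here is a minimal Python sketch with entirely hypothetical figures; the only claim is the shape of the math, not the numbers:

```python
# Illustrative 3-year TCO comparison. All figures are made up; the point
# is that the license line is a small slice of the total.
YEARS = 3

tools = {
    # tool: (annual license, annual ops overhead, one-time learning cost)
    "Tool X": (10_000, 60_000, 40_000),
    "Tool Y": (25_000, 20_000, 10_000),
}

for name, (license_fee, ops, learning) in tools.items():
    tco = YEARS * (license_fee + ops) + learning
    share = YEARS * license_fee / tco
    print(f"{name}: license {YEARS * license_fee:,} of TCO {tco:,} ({share:.0%})")
```

Here Tool X's cheap license hides the higher total: its license is roughly 12% of a ~250k TCO, while Tool Y costs less overall despite the pricier license.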
Instructions
- Define evaluation criteria: Build a matrix with must-haves (performance SLAs, licensing, security) and nice-to-haves (community, documentation, ease of use). Weight them: must-haves are pass/fail gates, nice-to-haves are scored. Include operational overhead and learning curve. (A scoring sketch follows this list.)
- Research systematically: Spend 4-6 weeks on significant evaluations. Read independent comparisons (not vendor benchmarks), talk to teams already using the tool, and run small PoCs. Don't evaluate at meeting speed.
- Create a comparison table: Tool A vs. Tool B vs. Tool C, scored on the weighted criteria. Show which options failed which must-haves. Example: "Tool X was fastest, but its licensing doesn't fit our compliance requirements. Tool Y scored 8.5/10 overall."
- Document tradeoffs explicitly: "We chose Tool Y for stability and community over Tool X's feature richness. Trade-off: slower feature delivery; benefit: reduced operational risk."
- Include reversibility in the evaluation: Can we migrate off this choice if needed? If the exit path is expensive, downgrade the score proportionally (see the reversibility sketch after this list). Reversible choices are safer choices.
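The scoring mechanics from steps 1 and 3 might look like the following Python sketch. Criteria names, weights, and scores are hypothetical placeholders; the pass/fail gating and the 8.5/10 outcome mirror the example above:

```python
# Evaluation-matrix sketch: must-haves are pass/fail gates; nice-to-haves
# are scored 0-10 and combined by weight (weights sum to 1.0).
MUST_HAVES = ["meets performance SLA", "license fits compliance", "passes security review"]
WEIGHTS = {"community": 0.3, "documentation": 0.3, "ease of use": 0.2, "operational overhead": 0.2}

def evaluate(name, gates, scores):
    """gates: criterion -> bool; scores: criterion -> 0..10."""
    failed = [c for c in MUST_HAVES if not gates[c]]
    total = round(sum(WEIGHTS[c] * scores[c] for c in WEIGHTS), 1)
    return name, failed, total

candidates = [
    evaluate("Tool X",
             {"meets performance SLA": True, "license fits compliance": False,
              "passes security review": True},
             {"community": 9, "documentation": 6, "ease of use": 8,
              "operational overhead": 5}),
    evaluate("Tool Y",
             {"meets performance SLA": True, "license fits compliance": True,
              "passes security review": True},
             {"community": 8, "documentation": 9, "ease of use": 8,
              "operational overhead": 9}),
]

for name, failed, total in candidates:
    verdict = f"{total}/10" if not failed else f"ELIMINATED (failed: {', '.join(failed)})"
    print(f"{name}: {verdict}")
```

The hard gate matters: Tool X never gets a score at all, which prevents a high nice-to-have total from papering over a compliance failure.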
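For step 5, one simple way to fold reversibility in, again only a sketch on a made-up 0-10 lock-in scale, is to apply a proportional discount to the weighted score:

```python
# Reversibility penalty sketch: the harder it is to migrate off a tool,
# the more its score is discounted. The 5%-per-point factor is arbitrary.
def adjusted_score(score, lock_in):
    """lock_in: 0 (trivial exit path) .. 10 (total vendor lock-in)."""
    return round(score * (1 - 0.05 * lock_in), 1)

print(adjusted_score(8.5, 2))   # cheap exit path: small discount
print(adjusted_score(8.5, 8))   # heavy lock-in: large discount
```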
Anti-Patterns
- Evaluation by loudest voice: The senior engineer who knows Tool X best advocates hardest, and the evaluation becomes tribal preference rather than systematic assessment. Use a rubric and enforce consistency.
- Ignoring operational burden: Picking a tool with great features but no documentation, a small community, or high operational overhead. In production, 80% of the cost is operations, not features.
- Over-personalizing to one engineer: Choosing a tool because one person loves it. What if they leave? Tool knowledge doesn't transfer. Prefer tools with broad adoption and good onboarding.
- Not revisiting evaluations: Making a choice once and assuming it still holds. The landscape shifts; revisit significant technology choices every 18-24 months. The market might have better options.
- Analysis paralysis: Evaluating 15 options for 8 weeks. At some point, the quality of the choice stops improving. Set an evaluation deadline (4-6 weeks for major tools, 1 week for minor ones).
Further Reading
- The Phoenix Project (Gene Kim) — technology choices as delivery accelerators
- "Choose Boring Technology" (Dan McKinley) - thoughtful technology selection strategy
- Thinking, Fast and Slow (Kahneman) — decision-making biases in evaluation
- Gartner Hype Cycle reports (for lifecycle positioning of technologies)