# Dual-Process Theory
## Overview
Dual-process theory (Kahneman, 2011; Stanovich & West, 2000) distinguishes two modes of cognitive processing: System 1 (fast, automatic, heuristic-driven) and System 2 (slow, deliberate, rule-based). Most judgments default to System 1, which is efficient but prone to systematic biases when heuristics misfire.
## When to Use
- Explaining why stakeholders make predictable judgment errors under time pressure or complexity
- Designing decision environments (nudges, checklists) that compensate for System 1 defaults
- Auditing existing processes to identify where heuristic shortcuts introduce risk
- Evaluating when intuitive expertise is reliable vs. when it is misleading
## When NOT to Use
- When decisions are already well-structured with algorithmic procedures (bias is engineered out)
- As an excuse to dismiss all intuitive judgment — expert intuition can be accurate in high-validity environments
- When the problem is motivational rather than cognitive (people know the right answer but choose otherwise)
## Assumptions
**IRON LAW:** System 1 operates by DEFAULT — System 2 engagement requires cognitive effort and is easily depleted. Under time pressure, cognitive load, or ego depletion, System 1 dominates and heuristic biases amplify.
Key assumptions:
- System 1 and System 2 are metaphors for processing modes, not discrete brain systems
- Heuristics are generally adaptive — biases emerge at the boundary conditions
- System 2 can override System 1, but only when cued and when cognitive resources are available
## Methodology
### Step 1 — Identify the Judgment or Decision Context
Characterize the decision: time pressure, complexity, familiarity, stakes, emotional involvement.
### Step 2 — Classify Processing Mode
| Feature | System 1 | System 2 |
|---|---|---|
| Speed | Fast, automatic | Slow, effortful |
| Awareness | Unconscious | Conscious |
| Capacity | High (parallel) | Low (serial) |
| Basis | Heuristics, associations | Rules, logic |
| Error type | Systematic biases | Computational mistakes |
| Triggered by | Default, familiarity | Novelty, conflict detection |
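The classification in the table above can be sketched as a simple scoring rule. This is a minimal illustration, not part of the skill itself: the factor weights and thresholds are invented for the example, and a real analysis would weigh the decision environment qualitatively.

```python
def dominant_mode(time_pressure: int, complexity: int,
                  familiarity: int, emotional_involvement: int) -> str:
    """Rate each factor 0 (low) to 2 (high); return the likely dominant mode.

    Hypothetical heuristic: pressure, familiarity, and emotion push toward
    System 1 defaults; complexity with available time cues System 2.
    """
    s1_score = time_pressure + familiarity + emotional_involvement
    s2_score = complexity + (2 - time_pressure)
    if s1_score > s2_score + 1:
        return "System 1"
    if s2_score > s1_score + 1:
        return "System 2"
    return "Mixed"


# Example: a rushed, familiar, emotionally charged call defaults to System 1.
print(dominant_mode(time_pressure=2, complexity=0,
                    familiarity=2, emotional_involvement=2))  # System 1
```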
### Step 3 — Map Relevant Heuristics and Biases
Common System 1 heuristics and their failure modes:
- Availability: judge frequency by ease of recall — biased by salience and recency
- Representativeness: judge probability by similarity — ignores base rates
- Anchoring: estimate by adjusting from initial value — insufficient adjustment
- Affect: judge risk/benefit by emotional reaction — neglects statistical evidence
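Base-rate neglect, the failure mode of the representativeness heuristic, can be made concrete with Bayes' rule. The numbers below are invented for illustration: a description strongly "fits" the engineer stereotype, yet the low base rate keeps the posterior far below intuition.

```python
# Hypothetical numbers, chosen only to illustrate base-rate neglect.
base_rate = 0.03        # prior: 3% of the population are engineers
p_match_if_eng = 0.80   # P(description fits | engineer)
p_match_if_not = 0.10   # P(description fits | not engineer)

# Bayes' rule: P(engineer | description fits)
posterior = (p_match_if_eng * base_rate) / (
    p_match_if_eng * base_rate + p_match_if_not * (1 - base_rate)
)
print(f"{posterior:.2f}")  # 0.20, far below the intuitive ~0.80
```

System 1 anchors on the 0.80 likelihood (similarity); System 2 must explicitly bring in the 3% base rate to reach the correct ~20%.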
### Step 4 — Design Intervention
- De-bias: slow down decisions, require explicit justification, use pre-mortems
- Nudge: restructure choice architecture to align System 1 defaults with desired outcomes
- Leverage: use System 1 strengths (pattern recognition) in high-validity, rapid-feedback domains
## Output Format
```markdown
## Dual-Process Analysis: [Context]

### Decision Environment
- Time pressure: [High/Medium/Low]
- Complexity: [High/Medium/Low]
- Emotional involvement: [High/Medium/Low]
- Dominant processing: [System 1 / System 2 / Mixed]

### Heuristic-Bias Map
| Heuristic | Bias Triggered | Evidence | Risk Level |
|-----------|---------------|----------|------------|
| [heuristic] | [bias] | [observation] | [High/Med/Low] |

### Intervention Design
1. [De-biasing or nudge strategy]
2. [Process change]
3. [Environmental redesign]
```
## Gotchas
- System 1/System 2 is a useful metaphor, not a literal brain architecture — avoid reifying the distinction
- Expert intuition (System 1) is highly accurate in domains with clear feedback and regular patterns (e.g., chess, firefighting)
- De-biasing training has poor transfer — changing the environment is more effective than training individuals
- Cognitive depletion effects are debated; do not assume a simple "willpower battery" model
- System 2 is not inherently "better" — it is slower, more costly, and still subject to motivated reasoning
- People often confuse confidence with accuracy; high System 1 confidence does not indicate correctness
## References
- Kahneman, D. (2011). Thinking, fast and slow. Farrar, Straus and Giroux.
- Stanovich, K. E. & West, R. F. (2000). Individual differences in reasoning: implications for the rationality debate. Behavioral and Brain Sciences, 23(5), 645-665.
- Tversky, A. & Kahneman, D. (1974). Judgment under uncertainty: heuristics and biases. Science, 185(4157), 1124-1131.