Divergence Loop

A brainstorming skill that forces genuine creative divergence by exploiting context isolation between generation rounds, then bridges wild ideas back to reality through constraint-based convergence.

Why Context Reset Matters

LLMs have a strong convergence bias -- given enough context, they drift toward safe, structured, "reasonable" outputs. Prompt-level instructions ("don't converge", "be wild") fail because the model's training rewards coherence over chaos. The only reliable way to maintain divergence across multiple rounds is context isolation: each generation round runs in a fresh execution context with zero memory of prior rounds. Files are the sole communication channel.

How to achieve context isolation depends on the platform. Use whatever mechanism provides a clean context with no carry-over from prior rounds (e.g., spawning a new agent, opening a new chat, or calling an API with an independent message history). The specific tool does not matter; what matters is that each round's generator has NO access to ideas from other rounds.
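As a minimal sketch of the API-call approach (assuming a Python orchestrator and a hypothetical `call_model` chat function, neither of which this skill prescribes), one isolated round might look like:

```python
import json
import tempfile
from pathlib import Path

def call_model(messages):
    """Placeholder for whatever completion API you use (hypothetical).
    Returns a canned idea here so the sketch runs; the only property
    that matters is that `messages` is built fresh for every call."""
    return ["idea seeded only by: " + messages[0]["content"]]

def run_isolated_round(out_dir: Path, n: int, theme: str, persona: str) -> Path:
    # Fresh message history each round: persona and theme go in, but
    # NO ideas from earlier rounds -- context isolation by construction.
    messages = [{
        "role": "user",
        "content": f"As {persona}, brainstorm wild ideas about: {theme}",
    }]
    path = out_dir / f"round_{n}.json"
    path.write_text(json.dumps(call_model(messages)))  # files: sole channel
    return path

out = Path(tempfile.mkdtemp())
round_files = [
    run_isolated_round(out, 1, "urban farming", "a deep-sea biologist"),
    run_isolated_round(out, 2, "urban farming", "a medieval blacksmith"),
]
```

Each round writes its output to disk and returns; nothing about round 1 is reachable from round 2 except by deliberately reading the file, which only the convergence step does.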

Workflow Overview

User Theme --> [Divergence Loop] --> [Convergence] --> Multiple Final Ideas
                    |                      |
              N rounds of:           User constraints +
              Persona-injected       last round ideas
              context-isolated       = filtered output
              generation + mashup
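The loop above can be sketched end to end under the same assumptions. The stubbed `run_round` stands in for a real context-isolated model call, and the keyword filter is illustrative only, not the skill's actual convergence prompt:

```python
import json
import tempfile
from pathlib import Path

PERSONAS = ["a deep-sea biologist", "a medieval blacksmith"]  # hypothetical

def run_round(out_dir: Path, n: int, theme: str, persona: str) -> Path:
    # Stand-in for one context-isolated generation round; a real round
    # would call a model with a fresh message history and no memory of
    # any other round.
    ideas = [f"{persona}: wild take on {theme} #{k}" for k in range(3)]
    path = out_dir / f"round_{n}.json"
    path.write_text(json.dumps(ideas))
    return path

def divergence_then_converge(theme: str, constraints: list[str]) -> list[str]:
    out_dir = Path(tempfile.mkdtemp())
    files = [run_round(out_dir, n, persona=p, theme=theme)
             for n, p in enumerate(PERSONAS, 1)]
    # Convergence is the ONLY step that reads all rounds together;
    # user constraints filter the pooled ideas down to workable ones.
    pooled = [idea for f in files for idea in json.loads(f.read_text())]
    return [idea for idea in pooled
            if all(c.lower() in idea.lower() for c in constraints)]

finals = divergence_then_converge("urban farming", ["farming"])
```

The design point is the asymmetry: generation rounds can only write files, while convergence can only read them, so divergence is preserved mechanically rather than by instruction.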
From caldiaworks/caldiaworks-marketplace · First seen Mar 23, 2026