# rlm-distill-ollama
## Dependencies
This skill requires Python 3.8+ and only the Python standard library; no external packages are needed.

To install this skill's dependencies:

    pip-compile ./requirements.in
    pip install -r ./requirements.txt

See ./requirements.txt for the dependency lockfile (currently empty, since only the standard library is used).
## /rlm-factory:distill

Summarize files into the RLM Summary Ledger. There are two paths, depending on context.

For the detailed execution protocol, see the agent: `rlm-distill-agent`.
### Path 1 -- Agent Distillation (default, fast, no Ollama)

The agent reads each file and writes a high-quality summary via inject_summary.py.

Use this path for 1-50 files; the agent is faster and produces better summaries than local Ollama.
    python3 .agents/skills/rlm-distill-ollama/scripts/inject_summary.py \
      --profile project \
      --file path/to/file.md \
      --summary "Your agent-generated summary here."
### Path 2 -- Ollama Batch (offline, bulk, 50+ files)

Requires Ollama running locally (`ollama serve`, model: `granite3.2:8b`).
    # All files in profile scope
    python3 .agents/skills/rlm-distill-ollama/scripts/distiller.py --profile project

    # Single file
    python3 .agents/skills/rlm-distill-ollama/scripts/distiller.py --profile project --file path/to/file.md

    # Changed in last 2 hours
    python3 .agents/skills/rlm-distill-ollama/scripts/distiller.py --profile project --since 2
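Internally, a batch distiller like this presumably calls Ollama's local `/api/generate` endpoint once per file. The sketch below shows such a call using only the standard library; the prompt wording and the batching logic are assumptions, not `distiller.py`'s actual implementation.

```python
# Sketch of one summarization call against a local Ollama server.
# The endpoint and payload follow Ollama's /api/generate API; the prompt
# and how distiller.py batches files are assumptions.
import json
import urllib.request

def summarize(text: str, model: str = "granite3.2:8b") -> str:
    payload = json.dumps({
        "model": model,
        "prompt": f"Summarize the following file in 3-5 sentences:\n\n{text}",
        "stream": False,
    }).encode("utf-8")
    req = urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```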
| Profile | Flag | Cache file |
|---|---|---|
| Docs / protocols | `--profile project` | `rlm_summary_cache.json` |
| Plugins / scripts | `--profile tools` | `rlm_tool_cache.json` |