LLM Setup

Configure Claude Projects, ChatGPT GPTs, Gemini Gems, and other LLM platforms using compiled AI Knowledge content from the ragbot system.

Architecture overview

The AI Knowledge system produces platform-specific content for each LLM's project/custom instruction system.

  • Knowledge concatenation (all-knowledge.md): runs in CI/CD (GitHub Actions), on every push to source/
  • Instruction compilation: runs locally (ragbot compile), when instructions change (rare)
  • RAG indexing: runs locally (ragbot index), when content changes and RAG is needed
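The concatenation step CI performs can be sketched roughly as follows. This is an assumption about what the GitHub Actions workflow does, not its actual script; the fixture files are hypothetical, and the real workflow may use different ordering and separators.

```shell
# Hypothetical fixture standing in for a real source/ tree.
mkdir -p source/runbooks source/datasets
printf '# Runbook A\n' > source/runbooks/a.md
printf '# Dataset B\n' > source/datasets/b.md

# Merge every markdown file under source/ in a stable, sorted order,
# with a comment marking where each file begins.
find source -name '*.md' | sort | while read -r f; do
  printf '\n<!-- %s -->\n' "$f"
  cat "$f"
done > all-knowledge.md
```

Sorting the file list keeps the merged output deterministic across runs, so diffs of all-knowledge.md stay readable.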

Output structure

ai-knowledge-{name}/
├── compiled/
│   └── {project}/
│       └── instructions/         # LLM-specific custom instructions
│           ├── claude.md
│           ├── chatgpt.md
│           └── gemini.md
└── all-knowledge.md              # Concatenated knowledge (auto-generated by CI/CD)

Key principle: Edit source/ files directly. Knowledge concatenation is automatic.

Knowledge delivery strategy:

  • Claude: GitHub sync the repo (Claude reads all-knowledge.md directly)
  • ChatGPT: Upload all-knowledge.md
  • Gemini: Upload all-knowledge.md (works within the 10-file Gem limit)

Claude Projects setup

Custom instructions

  1. Create a new Claude Project (or open an existing one).
  2. Go to Project Knowledge, then Custom Instructions.
  3. Copy content from compiled/{project}/instructions/claude.md.
  4. Paste into the custom instructions field.

Knowledge files

Option A: GitHub sync (recommended)

  1. Connect your GitHub account to Claude.
  2. Sync your ai-knowledge repository.
  3. Claude indexes all-knowledge.md and source files automatically.

Option B: Manual upload

  1. Go to Project Knowledge, then Files.
  2. Upload all-knowledge.md from the repo root.

ChatGPT GPT setup

Creating a GPT

  1. Go to https://chat.openai.com/gpts/editor
  2. Click "Create a GPT".
  3. Configure:
    • Name: Your project name
    • Description: Brief description
    • Instructions: Copy from compiled/{project}/instructions/chatgpt.md

Knowledge files

  1. In the GPT editor, go to the Knowledge section.
  2. Upload all-knowledge.md from the repo root.

Gemini Gems setup

Creating a Gem

  1. Go to https://gemini.google.com/gems
  2. Create a new Gem.
  3. Paste instructions from compiled/{project}/instructions/gemini.md.

Knowledge files

  1. Upload all-knowledge.md from the repo root.
  2. This single file contains all runbooks and datasets merged together.
  3. Works well within Gemini's 10-file limit per Gem.

Other LLMs (Grok, etc.)

  1. Copy instructions from compiled/{project}/instructions/ (use the closest match).
  2. Upload all-knowledge.md from the repo root.

Compilation and inheritance

How it works

Each user compiles projects in their own repo; which content is included depends on the inheritance chain.

Example: Compiling in ai-knowledge-personal:

compiled/
├── personal/                     # Baseline (ragbot + personal)
├── company/                      # personal + company merged
├── client-a/                     # personal + company + client-a merged
└── client-b/                     # personal + client-b merged

Example: Compiling in ai-knowledge-company (team member without access to personal):

compiled/
├── company/                      # Baseline (ragbot + company, NO personal)
├── client-a/                     # company + client-a (NO personal)
└── client-c/                     # company + client-c
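The merge behavior above can be illustrated with a plain file overlay. This is a simplified assumption for illustration: later layers in the chain win on filename collisions, while ragbot's actual merge logic may be more sophisticated.

```shell
# Simplified illustration of inheritance as a file overlay.
# Assumes later layers override earlier ones on filename collisions;
# ragbot's real merge may differ.
mkdir -p layers/company layers/client-a compiled/client-a
printf 'company voice\n'  > layers/company/style.md
printf 'client-a voice\n' > layers/client-a/style.md
printf 'company faq\n'    > layers/company/faq.md

# Apply layers in inheritance order; later copies win.
for layer in company client-a; do
  cp layers/"$layer"/* compiled/client-a/
done
```

After the loop, compiled/client-a/style.md holds the client-a version, while faq.md (defined only at the company layer) passes through unchanged.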

Privacy model

Content is only included if the user has access to the source repo:

  • Private content (ai-knowledge-personal) only appears in that user's compilations
  • Team members get team content but not personal content
  • Clients only get client-specific content

Running instruction compilation

# Compile instructions for a project
ragbot compile --project {name}

# Without LLM API calls (just assemble)
ragbot compile --project {name} --no-llm

# Force recompile (ignore cache)
ragbot compile --project {name} --force

# Verbose output
ragbot compile --project {name} --verbose

Knowledge concatenation (all-knowledge.md) is handled automatically by CI/CD; no manual step is needed.
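If you want to confirm CI picked up your last push, check when all-knowledge.md was last committed. The demo repo below is a fixture so the snippet is self-contained; in a real checkout, only the final git log line is needed, and the commit message format is an assumption.

```shell
# Fixture repo standing in for a real ai-knowledge checkout.
git init -q demo
git -C demo -c user.email=ci@example.com -c user.name=ci \
    commit -q --allow-empty -m 'init'
printf 'merged knowledge\n' > demo/all-knowledge.md
git -C demo add all-knowledge.md
git -C demo -c user.email=ci@example.com -c user.name=ci \
    commit -q -m 'ci: regenerate all-knowledge.md'

# The check itself: the most recent commit touching all-knowledge.md.
git -C demo log -1 --format='%s' -- all-knowledge.md
```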

Step-by-step workflow

  1. Compile instructions (only if instructions changed):

    ragbot compile --project {name} --no-llm
    
  2. For each project (e.g., client-a):

    • Claude: Create project, copy compiled/client-a/instructions/claude.md to custom instructions, GitHub sync the repo
    • ChatGPT: Create GPT, copy compiled/client-a/instructions/chatgpt.md to instructions, upload all-knowledge.md
    • Gemini: Create Gem, copy compiled/client-a/instructions/gemini.md to instructions, upload all-knowledge.md
  3. Verify by testing each project with a representative query.
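When you maintain several projects, step 1 can be scripted. The loop below is a dry run that only prints each command (the project names are placeholders; the flags come from the commands above). Drop the echo to execute for real.

```shell
# Dry run: print the compile command for each project instead of
# executing it. Project names are hypothetical placeholders.
for project in company client-a client-b; do
  echo ragbot compile --project "$project" --no-llm
done
```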

Updating projects

When to update

Knowledge (all-knowledge.md): Updates automatically via CI/CD on every push to source/. No manual step.

Instructions: Recompile only when instructions change (rare):

ragbot compile --project {name} --force

LLM sync:

  • Claude: GitHub sync auto-updates
  • ChatGPT/Gemini: Re-upload all-knowledge.md after source changes

Troubleshooting

"Instructions too long"

  • Move detailed content to knowledge files
  • Check manifest.yaml for token counts
  • Keep instructions focused on identity and behavior
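manifest.yaml is the authoritative source for token counts, but a rough check is easy from the shell. The estimate assumes roughly 4 characters per token, a common rule of thumb, and the file below is a fixture; in practice, point the check at compiled/{project}/instructions/claude.md or similar.

```shell
# Fixture instructions file for a self-contained example.
printf 'Keep instructions focused on identity and behavior.\n' > instructions.md

# Rough token estimate: ~4 characters per token (rule of thumb only;
# trust manifest.yaml for real counts).
chars=$(wc -c < instructions.md)
echo "approx tokens: $((chars / 4))"
```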

"Knowledge not being used"

  • Verify all-knowledge.md was uploaded correctly
  • Check if content is in instructions vs knowledge
  • For Claude: ensure GitHub sync is active and pointing to the repo

"Inheritance not working"

  • Verify my-projects.yaml exists in the personal repo
  • Check inheritance chain in compile-config.yaml
  • Run with --verbose to see inheritance resolution

"Content from wrong repo appearing"

  • Check which repo you are compiling in
  • Verify you have access to expected repos
  • Remember: content only comes from repos you can access