# LLM Project Setup

Configure Claude Projects, ChatGPT GPTs, Gemini Gems, and other LLM platforms using compiled AI Knowledge content from the ragbot system.
## Architecture overview

The AI Knowledge system produces platform-specific content for each LLM's project/custom-instruction system.

| Operation | Where | When |
|---|---|---|
| Knowledge concatenation (`all-knowledge.md`) | CI/CD (GitHub Actions) | Every push to `source/` |
| Instruction compilation | Local (`ragbot compile`) | When instructions change (rare) |
| RAG indexing | Local (`ragbot index`) | When content changes and RAG is needed |
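The CI/CD row above corresponds to something like the following GitHub Actions workflow. This is a sketch only: the workflow name, script path, and commit step are assumptions about how such a job could look, not ragbot's actual workflow file.

```yaml
# Sketch of a concatenation workflow triggered by pushes to source/.
name: concat-knowledge
on:
  push:
    paths:
      - "source/**"
jobs:
  concat:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - run: ./scripts/concat-knowledge.sh   # hypothetical concatenation script
      - run: |
          git config user.name "ci-bot"
          git config user.email "ci-bot@example.com"
          git add all-knowledge.md
          git commit -m "Regenerate all-knowledge.md" || exit 0
          git push
```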
### Output structure

```
ai-knowledge-{name}/
├── compiled/
│   └── {project}/
│       └── instructions/       # LLM-specific custom instructions
│           ├── claude.md
│           ├── chatgpt.md
│           └── gemini.md
└── all-knowledge.md            # Concatenated knowledge (auto-generated by CI/CD)
```

**Key principle:** Edit `source/` files directly. Knowledge concatenation is automatic.
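What the automatic concatenation step does can be sketched roughly as follows. This is a minimal sketch, not ragbot's implementation: the separator-comment format and the alphabetical file ordering are assumptions.

```shell
# Sketch of knowledge concatenation: merge every markdown file under source/
# into one output file. Separator format and ordering are assumptions.
concat_knowledge() {
  out="$1"
  : > "$out"                                   # truncate/create the output file
  find source -name '*.md' | sort | while IFS= read -r f; do
    printf '\n<!-- source: %s -->\n\n' "$f" >> "$out"
    cat "$f" >> "$out"
  done
}
```

In the real setup, CI regenerates `all-knowledge.md` on every push to `source/`, so nothing like this is ever run by hand.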
Knowledge delivery strategy:

- Claude: GitHub-sync the repo (Claude reads `all-knowledge.md` directly)
- ChatGPT: Upload `all-knowledge.md`
- Gemini: Upload `all-knowledge.md` (works within the 10-file Gem limit)
## Claude Projects setup

### Custom instructions

- Create a new Claude Project (or open an existing one).
- Go to Project Knowledge, then Custom Instructions.
- Copy content from `compiled/{project}/instructions/claude.md`.
- Paste into the custom instructions field.
### Knowledge files

**Option A: GitHub sync (recommended)**

- Connect your GitHub account to Claude.
- Sync the repository containing your ai-knowledge repo.
- Claude indexes `all-knowledge.md` and source files automatically.

**Option B: Manual upload**

- Go to Project Knowledge, then Files.
- Upload `all-knowledge.md` from the repo root.
## ChatGPT GPT setup

### Creating a GPT

- Go to https://chat.openai.com/gpts/editor
- Click "Create a GPT".
- Configure:
  - Name: your project name
  - Description: a brief description
  - Instructions: copy from `compiled/{project}/instructions/chatgpt.md`
### Knowledge files

- In the GPT editor, go to the Knowledge section.
- Upload `all-knowledge.md` from the repo root.
## Gemini Gems setup

### Creating a Gem

- Go to https://gemini.google.com/gems
- Create a new Gem.
- Paste instructions from `compiled/{project}/instructions/gemini.md`.

### Knowledge files

- Upload `all-knowledge.md` from the repo root. This single file contains all runbooks and datasets merged together.
- It fits well within Gemini's 10-file limit per Gem.
## Other LLMs (Grok, etc.)

- Copy instructions from `compiled/{project}/instructions/` (use the closest match).
- Upload `all-knowledge.md` from the repo root.
## Compilation and inheritance

### How it works

Each user compiles projects in their own repo. What content gets included depends on inheritance.

Example: compiling in `ai-knowledge-personal`:

```
compiled/
├── personal/     # Baseline (ragbot + personal)
├── company/      # personal + company merged
├── client-a/     # personal + company + client-a merged
└── client-b/     # personal + client-b merged
```

Example: compiling in `ai-knowledge-company` (a team member without access to personal):

```
compiled/
├── company/      # Baseline (ragbot + company, NO personal)
├── client-a/     # company + client-a (NO personal)
└── client-c/     # company + client-c
```
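The layered merges above can be sketched as a copy-over-copy loop. This is a sketch under two assumptions: that later layers override earlier ones, and that each layer's content lives in a hypothetical `layers/` directory (not ragbot's actual layout).

```shell
# Sketch: merge inheritance layers into compiled/{project}.
# Later layers overwrite files from earlier ones (assumed precedence).
merge_layers() {
  project="$1"; shift
  dest="compiled/$project"
  rm -rf "$dest"
  mkdir -p "$dest"
  for layer in "$@"; do
    cp -R "layers/$layer/." "$dest/"   # copy this layer's contents over the result
  done
}
```

For example, `merge_layers client-a personal company client-a` would build `compiled/client-a/` from the personal, company, and client-a layers in that order.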
### Privacy model

Content is only included if the user has access to the source repo:

- Private content (`ai-knowledge-{personal}`) appears only in that user's compilations
- Team members get team content but not personal content
- Clients get only client-specific content
## Running instruction compilation

```shell
# Compile instructions for a project
ragbot compile --project {name}

# Without LLM API calls (just assemble)
ragbot compile --project {name} --no-llm

# Force recompile (ignore cache)
ragbot compile --project {name} --force

# Verbose output
ragbot compile --project {name} --verbose
```

Knowledge concatenation (`all-knowledge.md`) is handled automatically by CI/CD -- no manual step needed.
## Step-by-step workflow

1. Compile instructions (only if instructions changed):

   ```shell
   ragbot compile --project {name} --no-llm
   ```

2. For each project (e.g., client-a):

   - Claude: create a project, copy `compiled/client-a/instructions/claude.md` to custom instructions, and GitHub-sync the repo
   - ChatGPT: create a GPT, copy `compiled/client-a/instructions/chatgpt.md` to instructions, and upload `all-knowledge.md`
   - Gemini: create a Gem, copy `compiled/client-a/instructions/gemini.md` to instructions, and upload `all-knowledge.md`

3. Verify by testing each project with a representative query.
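Before uploading, the per-project artifacts can be sanity-checked with a small script. This is a sketch; the paths follow the output structure shown earlier.

```shell
# Sketch: verify that compiled instructions and the knowledge file exist
# and are non-empty before uploading them to each platform.
check_project() {
  p="$1"
  for f in "compiled/$p/instructions/claude.md" \
           "compiled/$p/instructions/chatgpt.md" \
           "compiled/$p/instructions/gemini.md" \
           all-knowledge.md; do
    [ -s "$f" ] || { echo "missing or empty: $f" >&2; return 1; }
  done
  echo "ok: $p"
}
```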
## Updating projects

### When to update

Knowledge (`all-knowledge.md`): updates automatically via CI/CD on every push to `source/`. No manual step.

Instructions: recompile only when instructions change (rare):

```shell
ragbot compile --project {name} --force
```

LLM sync:

- Claude: GitHub sync auto-updates
- ChatGPT/Gemini: re-upload `all-knowledge.md` after source changes
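Whether a re-upload is actually needed can be checked against the last uploaded commit. This is a sketch: the `.last-upload` marker file is a hypothetical convention, not part of ragbot.

```shell
# Sketch: return success (0) when all-knowledge.md has changed since the
# commit recorded in .last-upload, i.e. ChatGPT/Gemini need a fresh upload.
needs_reupload() {
  last=$(cat .last-upload 2>/dev/null || true)
  if [ -z "$last" ]; then
    return 0                                    # never uploaded yet
  fi
  ! git diff --quiet "$last" -- all-knowledge.md
}
```

After re-uploading, the current commit would be recorded with `git rev-parse HEAD > .last-upload`.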
## Troubleshooting

### "Instructions too long"

- Move detailed content to knowledge files
- Check `manifest.yaml` for token counts
- Keep instructions focused on identity and behavior
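As a quick local check (a rough heuristic, not the tokenizer-accurate count that `manifest.yaml` reports), instructions length can be estimated at roughly four characters per token:

```shell
# Rough token estimate: bytes / 4. Real tokenizers will differ, so treat
# this only as an order-of-magnitude check against platform limits.
est_tokens() {
  chars=$(wc -c < "$1")
  echo $((chars / 4))
}
```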
### "Knowledge not being used"

- Verify `all-knowledge.md` was uploaded correctly
- Check whether the content lives in instructions or in knowledge
- For Claude: ensure GitHub sync is active and pointing to the repo
### "Inheritance not working"

- Verify `my-projects.yaml` exists in the personal repo
- Check the inheritance chain in `compile-config.yaml`
- Run with `--verbose` to see inheritance resolution
### "Content from wrong repo appearing"

- Check which repo you are compiling in
- Verify you have access to the expected repos
- Remember: content only comes from repos you can access