forge-idiomatic-engineer
Forge Idiomatic Engineer
Full-stack Rust framework. Single binary, PostgreSQL-backed. Axum + Tokio + SQLx. Macros generate runtime wiring and frontend bindings; each handler must be registered in `src/main.rs` (macros alone do not wire it in).
Compile-Loop Hard Rules
These cause hours of wasted debugging if missed. Internalize before writing code.
- `SQLX_OFFLINE=true` is mandatory for any `cargo check`/`cargo build` you run by hand. CI sets it globally. Without it, sqlx tries to validate every `sqlx::query!()` against your live `DATABASE_URL` — including queries inside published `forge-runtime` crate files you cannot edit — and you get a wall of "column does not exist" errors in third-party code. The simplest fix is `eval "$(forge env)"` in your shell rc; otherwise `export SQLX_OFFLINE=true` by hand.
- `forge check` auto-prepares the offline cache. It detects when `src/` is newer than `.sqlx/` and runs `cargo sqlx prepare --workspace` before the rest of the pipeline, so you don't need to think about prepare ordering. (For raw `cargo check`, you still need to run `forge migrate prepare` after editing any `sqlx::query!()`.) Pass `--no-prepare` in CI where the cache should already be correct.
- `forge migrate prepare` hard-fails if `cargo-sqlx` is missing. Install with `cargo install sqlx-cli --no-default-features --features postgres` and re-run.
- All `forge` commands walk up to find `forge.toml`. Run them from any subdirectory; the resolved root is printed at start. No need to `cd` first.
- If anything in the compile loop feels off, run `forge doctor` first. It checks rustc, `cargo-sqlx`, `SQLX_OFFLINE`, `DATABASE_URL` reachability, Docker, frontend tooling, `forge.toml` syntax, `.sqlx/` freshness, and the latest migration's `@up`/`@down` markers in one shot.
- A passing `cargo sqlx prepare` is a passing compile. Don't run `forge check` purely to "confirm" what prepare just proved — prepare invokes a full `cargo check` internally. Run `forge check` only when you have new edits since the last prepare, or to exercise the rest of the validation suite (registration, schema, clippy).
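The by-hand loop (when not going through `forge check`) can be sketched as follows. The `forge`/`cargo` invocations are the ones described above, commented out so the snippet runs anywhere:

```shell
# Sketch of the manual compile loop.
export SQLX_OFFLINE=true       # mandatory for any hand-run cargo command
# forge migrate prepare        # refresh .sqlx/ after editing sqlx::query!()
# cargo check                  # now validates against the offline cache
echo "SQLX_OFFLINE=$SQLX_OFFLINE"
```

With `eval "$(forge env)"` in your shell rc, the `export` line is already taken care of.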
Session Start: Read Once, Trust Memory
Run bash docs/skills/forge-idiomatic-engineer/scripts/orient.sh first. The script walks up to find forge.toml, then prints a structured dump: project name + auth mode + frontend, environment readiness (SQLX_OFFLINE, DATABASE_URL, cargo-sqlx, Docker), .sqlx/ cache freshness, the contents of src/main.rs / src/functions/mod.rs / src/schema/mod.rs, every registered handler grouped by kind, the latest migration, reactivity-enabled tables, and concrete NEXT action hints. One invocation replaces five separate reads.
Always read references/pitfalls.md and references/resilience.md once per session — the script doesn't print them.
Fallback when the script is unavailable (older checkout, sandboxed env): read forge.toml, Cargo.toml, src/main.rs, src/functions/mod.rs, and the most recent file under migrations/.
These files change only when you write to them — re-reading mid-session is almost always wasted context.
After auto-compaction, trust the summary's file inventory. Don't re-read main.rs / mod.rs to confirm something the summary already documented — only re-read if you're about to write to them and need exact current content. If a compaction attaches a file as an <attachment>, treat it as already in context; don't issue a fresh Read.
When you do need to read a file, read it fully in one call. Don't issue overlapping ranges (offset 200 then offset 1) — combine into a single read at offset 1 with a wide limit. Forge handler files are rarely larger than 600 lines.
Handler Types
| Concept | Macro | Struct suffix | Registration |
|---|---|---|---|
| Read-only query | `#[forge::query]` | `Query` | `.register_query::<FnNameQuery>()` |
| Data mutation | `#[forge::mutation]` | `Mutation` | `.register_mutation::<FnNameMutation>()` |
| Background job | `#[forge::job]` | `Job` | `.register_job::<FnNameJob>()` |
| Scheduled task | `#[forge::cron]` | `Cron` | `.register_cron::<FnNameCron>()` |
| Durable workflow | `#[forge::workflow]` | `Workflow` | `.register_workflow::<FnNameWorkflow>()` |
| Long-running process | `#[forge::daemon]` | `Daemon` | `.register_daemon::<FnNameDaemon>()` |
| External HTTP event | `#[forge::webhook]` | `Webhook` | `.register_webhook::<FnNameWebhook>()` |
| AI agent tool | `#[forge::mcp_tool]` | `McpTool` | `.register_mcp_tool::<FnNameMcpTool>()` |
Or use `.auto_register()` to pick up all handlers via inventory.
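Put together, a hand-wired `src/main.rs` might look like this. It is a sketch only — `ForgeApp::new()`, `.run()`, and the handler names `list_posts`/`create_post` are illustrative assumptions; the `register_*` calls follow the table above:

```rust
// Hypothetical src/main.rs wiring. Each handler is registered explicitly;
// the macro alone does not wire it in.
mod functions;

use functions::create_post::CreatePostMutation;
use functions::list_posts::ListPostsQuery;

#[tokio::main]
async fn main() -> forge::Result<()> {
    forge::ForgeApp::new()
        .register_query::<ListPostsQuery>()
        .register_mutation::<CreatePostMutation>()
        .run()
        .await
}
```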
Naming Rules
- `pub async fn` handlers only — private functions fail codegen.
- `snake_case` fn names → macro generates `PascalCase` + type suffix. Do not include the type in the fn name (`heartbeat`, not `heartbeat_daemon`, or you get `HeartbeatDaemonDaemon`).
- `#[forge::model]` must be the first attribute on a struct.
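A concrete sketch of the naming rule (the body and context type are placeholders):

```rust
// Correct: the fn name carries no type suffix; the macro adds it.
#[forge::daemon]
pub async fn heartbeat(ctx: DaemonContext) -> forge::Result<()> {
    // generated struct: HeartbeatDaemon
    Ok(())
}

// Wrong: `pub async fn heartbeat_daemon` would generate HeartbeatDaemonDaemon,
// and a private `async fn heartbeat` would fail codegen entirely.
```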
Context API at a Glance
Memorize so you don't reach into the forge-core source mid-implementation:
- `ctx.db()` → `ForgeDb` (a `sqlx::Executor`). Pass directly to query macros: `sqlx::query_as!(...).fetch_one(ctx.db()).await?`.
- `ctx.conn().await?` → `ForgeConn<'_>` (transactional, mutations only). Pass `&mut conn` to query macros.
- `ctx.user_id()` → `Result<Uuid>` on `QueryContext` and `MutationContext`. Returns `ForgeError::Unauthorized` if no principal. There is no `ctx.auth()` method on `MutationContext`.
- `ctx.db_conn()` → `DbConn<'_>` for shared helpers that must work in both queries and mutations. `DbConn` has an inverted convention (call `.fetch_*` on the `DbConn`, passing the query) — see `references/patterns.md`.
- Let type inference name the bindings (`let mut conn = ctx.conn().await?`). Don't import `ForgeConn`/`ForgeDb`/`DbConn` and write explicit types unless a helper signature requires it.
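A minimal sketch of the two main access paths — the table, columns, and `Post` struct are hypothetical:

```rust
// Query side: pass ctx.db() straight into the bang-macro.
#[forge::query]
pub async fn get_post(ctx: QueryContext, id: Uuid) -> forge::Result<Post> {
    let user_id = ctx.user_id()?; // ForgeError::Unauthorized if no principal
    let post = sqlx::query_as!(
        Post,
        "SELECT id, title FROM posts WHERE id = $1 AND owner_id = $2",
        id,
        user_id
    )
    .fetch_one(ctx.db())
    .await?;
    Ok(post)
}

// Mutation side: take the transactional connection and pass &mut conn.
#[forge::mutation]
pub async fn rename_post(ctx: MutationContext, id: Uuid, title: String) -> forge::Result<()> {
    let mut conn = ctx.conn().await?; // type inferred, per the rule above
    sqlx::query!("UPDATE posts SET title = $1 WHERE id = $2", title, id)
        .execute(&mut conn)
        .await?;
    Ok(())
}
```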
Workflow
- Orient — read the session-start file list above plus `references/pitfalls.md` and `references/resilience.md`. Detect frontend via `frontend/package.json` (Svelte) or `frontend/Cargo.toml` (Dioxus).
- Plan the slice — decide which handlers, migrations, and frontend changes belong in this PR before writing code. Surgical, vertical, one feature.
- Checkpoint loop, one handler at a time:
  - Run `forge new <kind> <name>` instead of writing the file by hand. It scaffolds the right macro defaults, appends `pub mod <name>;` to `src/functions/mod.rs`, and inserts `mod functions;` in `src/main.rs` if missing. Kinds: `query`, `mutation`, `job`, `cron`, `workflow`, `daemon`, `webhook`, `mcp_tool`, `model`, `enum`.
  - Edit the scaffolded file: replace placeholder SQL/business logic with the real implementation.
  - `forge check`. It auto-prepares the `.sqlx/` cache when sources are newer, so you don't need a separate `forge migrate prepare` step in the common case.
  - If it fails, fix the root cause and re-run only the failing step. Do not write the next handler with errors outstanding.
  - Move to the next handler.
- Run `forge generate` after backend changes settle. Never edit generated files.
- Frontend — wire the UI against the generated bindings. `forge test` for Playwright E2E.
- Final pass — `forge check` clean, `forge test` green, write a brief change summary.
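One checkpoint iteration, sketched. The `forge` invocations are the ones from the loop above, commented out so the snippet runs anywhere; `create_post` is a placeholder handler name:

```shell
# One iteration of the checkpoint loop.
name="create_post"
# forge new mutation "$name"            # scaffold + register the module
# $EDITOR "src/functions/${name}.rs"    # replace placeholder SQL/logic
# forge check                           # auto-prepares .sqlx/ if stale
echo "iterate: ${name}"
```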
When the user says "fix it" / "can you fix it" after you've diagnosed a problem, fix it — don't ask for re-confirmation. Only pause to confirm for destructive actions (data deletion, schema drops, force pushes).
Architectural Defaults (choose upfront)
- Auth: Social OAuth, password + HS256, or RS256 — pick before coding (see `patterns.md`). Social logins must link via the `user_identities` table.
- Env: `ctx.env_require()`/`ctx.env_or()`, never `std::env::var()`.
- HTTP: `ctx.http()` for RPC, `ctx.raw_http()` when you need `bytes_stream()` or custom redirect policy.
- SQL: `sqlx::query!()`/`query_as!()` bang-macros only. Run `forge migrate prepare` after schema or query changes — see Compile-Loop Hard Rules.
- Jobs/workflows: dispatch only inside mutations (transactions are on by default). Never set `transactional = false` on a mutation that dispatches.
- Shared logic: extract to `src/utils/` the moment two handlers need it.
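For the dispatch rule specifically, a hedged sketch — the exact `dispatch_job` call shape is not documented here, and `SendWelcomeEmailJob` with its payload is invented for illustration (see `references/patterns.md` for the real pattern):

```rust
// Mutations are transactional by default, so dispatching here is safe:
// the job is enqueued atomically with the insert.
#[forge::mutation]
pub async fn signup(ctx: MutationContext, email: String) -> forge::Result<Uuid> {
    let mut conn = ctx.conn().await?;
    let user_id = sqlx::query_scalar!(
        "INSERT INTO users (email) VALUES ($1) RETURNING id",
        email
    )
    .fetch_one(&mut conn)
    .await?;
    // Hypothetical dispatch shape; never set transactional = false here.
    ctx.dispatch_job(SendWelcomeEmailJob { user_id }).await?;
    Ok(user_id)
}
```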
Reference Selection Guide
| Task | Reference |
|---|---|
| Any new handler (mandatory read) | references/resilience.md |
| Macros, context, errors, configuration, CLI | references/api.md |
| Backend patterns: jobs, workflows, auth, webhooks, compile loop | references/patterns.md |
| Copy-paste recipes (user fetch, plan gating, email, S3, payments, AI) | references/recipes.md |
| Frontend principles (reactivity, subscriptions, errors, uploads) | references/frontend.md |
| SvelteKit specifics (runes, stores, auth helper) | references/frontend/svelte.md |
| Dioxus specifics (hooks, signals, auth keying) | references/frontend/dioxus.md |
| Writing tests (backend builders + Playwright scenarios) | references/testing.md |
| Debugging build or runtime errors | references/pitfalls.md |
Engineering Principles
- Design for failure: auth drops, entities vanish, networks fail. See `resilience.md`.
- Zero dead code: delete unused code and replaced patterns entirely. Workspace lints deny `dead_code`, `unwrap_used`, `panic`, `indexing_slicing`, `unsafe_code`.
- Surgical diffs: thin vertical slice per PR.
- Boundary validation: validate at handler entry, return `ForgeError` variants (never `unwrap`/`expect`/`panic!`).
- Scope enforcement (compile-time enforced by the macro): private queries must filter by `user_id`/`owner_id`. `ctx.user_id()` for the principal. Opt out with `#[query(unscoped)]` only for shared/admin data.
- Transactional dispatch (compile-time enforced): `dispatch_job` and `start_workflow` require a transactional mutation. Since transactions are on by default, just don't set `transactional = false` on mutations that dispatch.
- System tables are off-limits (`forge check` enforced): never `INSERT`/`UPDATE`/`DELETE` on `forge_*` tables. Use `dispatch_job`, `start_workflow`, `record_signal`.
- Migrations: `-- @up`/`-- @down` markers. Enable reactivity with `SELECT forge_enable_reactivity('table_name');`. No `IF NOT EXISTS`.
- Not-found handling: `fetch_optional().await?.ok_or_else(|| ForgeError::NotFound(format!(...)))`.
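The not-found convention expands to a sketch like this inside a handler (the query and `Post` struct are placeholders):

```rust
// Missing row -> ForgeError::NotFound, never unwrap/expect.
let post = sqlx::query_as!(Post, "SELECT id, title FROM posts WHERE id = $1", id)
    .fetch_optional(ctx.db())
    .await?
    .ok_or_else(|| ForgeError::NotFound(format!("post {id} not found")))?;
```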
Output Contract
On completion, report:
- Changes: files touched, failure modes handled, tests added (include failure paths).
- Verification: commands run, blockers, final `forge check` result.
- Review: resilience gaps, assumptions.
Every handler must survive revoked auth, deleted entities, concurrent modification, and network drops.