motherduck-migrate-to-motherduck
# Migrate to MotherDuck
Use this skill when the user needs a migration plan from another warehouse, PostgreSQL estate, or mixed analytics stack onto MotherDuck.
This is a use-case skill. It orchestrates motherduck-connect, motherduck-explore, motherduck-load-data, motherduck-model-data, motherduck-query, and motherduck-ducklake.
## Start Here: Is a MotherDuck Server Active?
Always determine this before writing a migration plan.
- If a remote MotherDuck MCP server or local MotherDuck server is active, use it.
- Ask which MotherDuck database or workspace will receive the migration if the user has not specified it.
- Explore the live target side first when available:
  - existing databases and schemas
  - current landing zones
  - current analytical tables
  - naming conventions
  - any partial migration already in place
Also capture the source-side shape:
- source platform
- source table grain
- key metrics
- validation keys
- serving workloads after cutover
If no server is active, ask for representative source and target schemas before finalizing the migration plan.
## Use This Skill When
- The user is moving from Snowflake, Redshift, Postgres, or similar.
- The user needs cutover sequencing and validation.
- The user needs to decide between native MotherDuck, pg_duckdb, or DuckLake.
- The migration plan needs rollback, not just a list of copy commands.
## Migration Defaults
- native MotherDuck storage first
- pg_duckdb when extending an existing PostgreSQL estate is the least disruptive path
- validate before cutover
- port SQL dialect and data types deliberately before performance tuning
- phased cutover over big-bang replacement
## Workflow
- Confirm whether live MotherDuck discovery is available.
- Classify the source system and the target serving pattern.
- Inspect the target-side MotherDuck layout if available.
- Pick the connection and ingestion path.
- Inventory incompatible SQL, functions, data types, and operational assumptions.
- Rebuild the analytical model in DuckDB SQL.
- Run source-vs-target validation.
- Cut over one workload at a time.
When this skill produces a native DuckDB (`md:`) connection, watermark it with `custom_user_agent=agent-skills/2.2.0(harness-<harness>;llm-<llm>)`. If metadata is missing, fall back to `harness-unknown` and `llm-unknown`.
## Output
The output of this skill should be:
- the target pattern
- the migration sequence
- the validation plan
- the rollback path
- the first cutover slice
If the caller explicitly asks for structured JSON, return raw JSON only with no Markdown fences or prose before/after it. This is mainly for automated tests, regression checks, or downstream tooling that needs a stable machine-readable shape. Normal human-facing use of the skill can stay in prose unless JSON is explicitly requested.
Use this exact top-level shape when JSON is requested:
```json
{
  "summary": {},
  "assumptions": [],
  "implementation_plan": [],
  "validation_plan": [],
  "risks": []
}
```
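When returning the structured form, a quick self-check that the payload matches the required top-level shape helps catch drift before it reaches downstream tooling. A sketch (the plan contents are illustrative):

```python
import json

REQUIRED_KEYS = {"summary", "assumptions", "implementation_plan",
                 "validation_plan", "risks"}

plan = {
    "summary": {"target_pattern": "native MotherDuck storage"},
    "assumptions": ["source is Snowflake"],
    "implementation_plan": ["land raw", "rebuild models", "validate", "cut over"],
    "validation_plan": ["row counts", "metric checksums"],
    "risks": ["dialect gaps in ported SQL"],
}

# Enforce the exact top-level shape before emitting raw JSON.
assert set(plan) == REQUIRED_KEYS, "unexpected top-level shape"
print(json.dumps(plan))  # raw JSON only, no fences or surrounding prose
```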
## References
- references/MIGRATION_PLAYBOOK.md -- preserved detailed migration guidance that used to live in this skill
- references/MIGRATION_VALIDATION.md -- validation checks and comparison helpers
## Runnable Artifact
- artifacts/migration_validation_example.py -- MotherDuck-backed Python example for source-vs-target validation and variance reporting
- artifacts/migration_validation_example.ts -- TypeScript companion artifact with the same validation output contract
Run it with:

```shell
uv run --with duckdb python skills/motherduck-migrate-to-motherduck/artifacts/migration_validation_example.py
```

Run the same validation flow against temporary MotherDuck databases:

```shell
MOTHERDUCK_ARTIFACT_USE_MOTHERDUCK=1 \
uv run --with duckdb python skills/motherduck-migrate-to-motherduck/artifacts/migration_validation_example.py
```

Validate the TypeScript companion artifact:

```shell
uv run scripts/test_typescript_artifacts.py
```
## Related Skills
- motherduck-connect -- choose the connection path for the target system
- motherduck-explore -- inspect the target-side MotherDuck workspace
- motherduck-load-data -- bulk movement and raw landing patterns
- motherduck-model-data -- shape the target analytical model
- motherduck-query -- port and validate critical SQL
- motherduck-ducklake -- only when open-table-format requirements are explicit