# peer-review-template

## When to use
Before any analysis that will influence a significant decision is delivered to stakeholders. Peer review should be part of the standard delivery checklist for: dashboards going into production, reports used for strategic decisions, A/B test conclusions, and any analysis that will be cited externally.
## Process
- **Agree scope of review** — clarify with the author what kind of review is needed: logic check, statistical validity, code review, or presentation clarity. Use `references/peer_review_framework.md` to set expectations.
- **Review analytical rigour** — work through `references/analytical_rigor_checklist.md`: are the question and method aligned? Are assumptions valid? Is the conclusion supported by the data?
- **Review code or SQL** — if the analysis involves code, apply `references/code_review_for_analysis.md`: reproducibility, correctness, readability, and performance.
- **Write feedback** — use the feedback structure in `assets/peer_review_template.md`: must-fix issues, should-fix suggestions, and optional improvements. Be specific; "this is unclear" is not actionable.
- **Author responds** — the author addresses each point and notes its disposition (fixed / accepted as-is with rationale / deferred); use `assets/review_response_template.md`.
- **Close the review** — the reviewer confirms must-fix items are resolved and signs off; document the outcome in `assets/peer_review_template.md`.
## Inputs the skill needs
- Analysis output to review (notebook, report, dashboard spec, or SQL)
- Review scope agreed with author
- Reviewer name and role
## Output

- Completed review with categorised feedback (`peer_review_template.md`)
- Author response log (`review_response_template.md`)
- Sign-off confirmation