methodology-explainer
When to use
Any time you deliver findings that require the audience to trust the method — A/B tests, attribution models, forecasts, statistical analyses, or anything where "how did you get that?" is a likely question. Write the methodology section before distributing results, not after questions arrive.
Process
- Identify the audience tier — use references/audience_depth_guide.md to determine the appropriate level: executive (why/what), business analyst (what/how at a high level), or technical peer (full detail).
- Select the explanation pattern — use references/methodology_explanation_patterns.md to pick the structure: narrative, layered (short summary + appendix), or Q&A format.
- Draft the core explanation — cover: what question was asked, what data was used, what method was applied, what assumptions were made, and what the key limitation is.
- Apply plain-language rewrites — replace statistical terms with business equivalents per the translation table in references/methodology_explanation_patterns.md.
- Add a limitations paragraph — every methodology explanation must include at least one honest limitation and what it means for the conclusions.
- Produce deliverables — write the write-up using assets/methodology_writeup_template.md; if the methodology will be presented, use assets/methodology_slide_template.md.
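The steps above can be sketched as a small script. This is a minimal illustration only: the sample translation table, the audience-tier behavior, and the write-up layout are assumptions for the sketch, not the actual contents of the referenced files or templates.

```python
# Sketch of the process above. The translation entries, tier handling,
# and section layout are illustrative assumptions, not the real skill files.

TRANSLATION = {
    # plain-language rewrites (step 4) -- hypothetical example entries
    "p-value": "the chance we'd see a difference this large by luck alone",
    "confidence interval": "the range the true value most likely falls in",
}


def plain_language(text: str) -> str:
    """Replace statistical terms with business equivalents."""
    for term, business in TRANSLATION.items():
        text = text.replace(term, business)
    return text


def methodology_writeup(question, data, method, assumptions, limitation,
                        audience="executive"):
    """Draft the core explanation (step 3) plus the mandatory limitations
    paragraph (step 5), at a depth chosen by audience tier (step 1)."""
    sections = [
        f"Question: {question}",
        f"Data: {data}",
        f"Method: {plain_language(method)}",
    ]
    if audience != "executive":  # deeper tiers also get the assumptions
        sections.append("Assumptions: " + "; ".join(assumptions))
    sections.append(f"Limitation: {limitation}")  # always included
    return "\n\n".join(sections)


print(methodology_writeup(
    question="Did the new checkout flow lift conversion?",
    data="8 weeks of web sessions, randomized 50/50",
    method="a two-sample test (p-value: 0.03)",
    assumptions=["sessions are independent", "no novelty effect"],
    limitation="Mobile app traffic was excluded, so results may not "
               "generalize to app users.",
))
```

Note the two invariants the skill insists on: statistical jargon never reaches the reader untranslated, and the limitations paragraph is emitted for every audience tier, not just technical peers.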
Inputs the skill needs
- Description of the analytical method used (technique, data, steps)
- Audience type (executive / business / technical)
- Any assumptions or known limitations
Output
- Plain-language methodology write-up
- Limitations section
- Completed methodology_writeup_template.md or methodology_slide_template.md