testing-test-cases
<skill_overview>
<description>Create well-structured test cases that are clear, reproducible, and maintainable</description>
<when_to_use>
<use_case>Documenting test scenarios</use_case>
<use_case>Creating test case specifications</use_case>
<use_case>Reviewing test case quality</use_case>
</when_to_use>
</skill_overview>
<test_case_structure>
<description>Standard fields in a test case</description>
<field name="Test Case ID">
<description>Unique identifier for the test case</description>
<format>TC_MODULE_NUMBER (e.g., TC_AUTH_001, TC_ORD_012)</format>
<purpose>Easy reference and tracking</purpose>
</field>
<field name="Title">
<description>Clear, concise description of what is being tested</description>
<format>Should start with a verb and be specific and meaningful</format>
<examples>
<good>Verify user login with valid credentials</good>
<good>Check password validation for weak password</good>
<good>Test product search with empty results</good>
<bad>Login test</bad>
<bad>Test form</bad>
<bad>Check it works</bad>
</examples>
</field>
<field name="Priority">
<description>Importance level of the test case</description>
<values>
<value>Critical - Must pass before release</value>
<value>High - Important feature, should be tested</value>
<value>Medium - Standard functionality</value>
<value>Low - Edge cases, nice to test</value>
</values>
</field>
<field name="Module">
<description>Functional area or component being tested</description>
<examples>Authentication, User Profile, Orders, Payments, Search</examples>
</field>
<field name="Pre-conditions">
<description>State and requirements before test execution</description>
<examples>
<example>User account exists with email: test@example.com</example>
<example>User is logged in with valid credentials</example>
<example>Database contains at least one product</example>
<example>Application is running and accessible</example>
</examples>
</field>
<field name="Test Steps">
<description>Detailed, numbered sequence of actions</description>
<guidelines>
<guideline>Each step should be clear and unambiguous</guideline>
<guideline>Steps should be ordered and sequential</guideline>
<guideline>Include navigation details (page names, URLs)</guideline>
<guideline>Specify element identifiers (IDs, labels, placeholders)</guideline>
<guideline>Include data to be entered in forms</guideline>
</guidelines>
</field>
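The guidelines above can be captured in a structured form. A minimal Python sketch of well-formed steps for a login scenario; the page path, element selectors, and data values are illustrative assumptions, not taken from a real application:

```python
# One action per step, with navigation details, element identifiers, and input data.
# Selectors (input#email, etc.) and values are illustrative placeholders.
login_steps = [
    {"step": 1, "action": "Navigate to the login page", "target": "/login"},
    {"step": 2, "action": "Enter email in the email field", "element": "input#email", "data": "test@example.com"},
    {"step": 3, "action": "Enter password in the password field", "element": "input#password", "data": "P@ssw0rd!"},
    {"step": 4, "action": "Click the 'Sign in' button", "element": "button#sign-in"},
]

def render_steps(steps):
    """Render steps as the numbered list used in a test case document."""
    return "\n".join(f'{s["step"]}. {s["action"]}' for s in steps)
```

Keeping steps as data rather than free text makes the "one action per step" and "number steps sequentially" guidelines easy to enforce mechanically.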
<field name="Test Data">
<description>Specific data values used in the test</description>
<types>
<type>Input values - Data entered into forms or fields</type>
<type>Test credentials - Usernames, passwords, tokens</type>
<type>URLs - Specific endpoints or pages</type>
<type>IDs - Resource identifiers for lookup/update/delete</type>
</types>
</field>
<field name="Expected Result">
<description>What should happen if the system works correctly</description>
<guidelines>
<guideline>Be specific and observable</guideline>
<guideline>Include UI changes, redirects, messages</guideline>
<guideline>Verify both successful outcomes and error cases</guideline>
<guideline>Check data persistence if applicable</guideline>
</guidelines>
</field>
<field name="Actual Result">
<description>What actually happened during test execution</description>
<fill>Filled in during test execution, after running the test</fill>
</field>
<field name="Status">
<description>Test outcome after execution</description>
<values>
<value>Pass - Test passed as expected</value>
<value>Fail - Test failed, system did not behave as expected</value>
<value>Blocked - Cannot execute due to defect or dependency</value>
<value>Skipped - Not executed, intentionally omitted</value>
<value>Not Run - Not yet executed</value>
</values>
</field>
<field name="Defect ID">
<description>Link to bug report if test failed</description>
<format>BUG-123, JIRA-456, etc.</format>
</field>
<field name="Notes/Comments">
<description>Additional observations, issues, or recommendations</description>
<examples>Workarounds found, additional edge cases discovered</examples>
</field>
</test_case_structure>
<test_case_template>
<description>Standard template for test case documentation</description>
<template>
Test Case ID: TC_MODULE_XXX
Title: [Clear description of what is tested]
Priority: [Critical/High/Medium/Low]
Module: [Functional area]
Designed By: [Author name]
Date: [Creation date]

Pre-conditions:
- Pre-condition 1
- Pre-condition 2

Test Steps:
1. [Action description]
2. [Action description]
3. [Action description]

Test Data:
- Field: value
- Field: value

Expected Result:
- Expected outcome 1
- Expected outcome 2

Actual Result: [Filled during execution]
Status: [Pass/Fail/Blocked/Skipped/Not Run]
Defect ID: [If failed]
Notes: [Additional observations]
</template>
</test_case_template>
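The template maps naturally onto a record type. A minimal Python sketch; the field names and Status values follow the structure described above, and nothing here is tied to a specific test management tool:

```python
from dataclasses import dataclass, field
from enum import Enum
from typing import Optional

class Status(Enum):
    PASS = "Pass"
    FAIL = "Fail"
    BLOCKED = "Blocked"
    SKIPPED = "Skipped"
    NOT_RUN = "Not Run"

@dataclass
class TestCase:
    test_case_id: str                  # e.g. TC_AUTH_001
    title: str                         # starts with a verb, specific
    priority: str                      # Critical / High / Medium / Low
    module: str                        # e.g. Authentication, Orders
    pre_conditions: list = field(default_factory=list)
    steps: list = field(default_factory=list)
    test_data: dict = field(default_factory=dict)
    expected_result: str = ""
    actual_result: str = ""            # filled in during execution
    status: Status = Status.NOT_RUN
    defect_id: Optional[str] = None    # e.g. BUG-123, set on failure
    notes: str = ""

tc = TestCase(
    test_case_id="TC_AUTH_001",
    title="Verify user login with valid credentials",
    priority="Critical",
    module="Authentication",
)
```

New cases start in `Status.NOT_RUN` with no defect attached, matching the lifecycle described in the Status and Defect ID fields.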
<writing_test_steps>
<description>Best practices for writing test steps</description>
<guidelines>
<guideline>Be specific and precise in descriptions</guideline>
<guideline>Use clear, simple language</guideline>
<guideline>One action per step</guideline>
<guideline>Include navigation details (which page, which section)</guideline>
<guideline>Specify exact data to enter</guideline>
<guideline>Identify elements clearly (buttons, links, fields)</guideline>
<guideline>Number steps sequentially</guideline>
<guideline>Keep steps at appropriate granularity</guideline>
</guidelines>
</writing_test_steps>
<defining_expected_results>
<description>Writing clear and verifiable expected results</description>
<guidelines>
<guideline>Describe observable outcomes</guideline>
<guideline>Be specific about what to check</guideline>
<guideline>Include UI changes, page redirects, messages</guideline>
<guideline>Verify data persistence when applicable</guideline>
<guideline>Check multiple aspects where relevant</guideline>
</guidelines>
</defining_expected_results>
<pre_conditions>
<description>Documenting required state before test execution</description>
<type name="User State">
<description>User account status and data</description>
<examples>User exists, user is active, user has specific role</examples>
</type>
<type name="Application State">
<description>Application configuration and status</description>
<examples>Application is running, feature is enabled, database is accessible</examples>
</type>
<type name="Data State">
<description>Required data in system</description>
<examples>Product exists in catalog, order has specific status, user has permissions</examples>
</type>
<type name="Environment State">
<description>Environment and configuration</description>
<examples>Running on staging environment, API endpoints are accessible, test data is available</examples>
</type>
</pre_conditions>
<test_data_management>
<description>Managing test data in test cases</description>
<guidelines>
<guideline>Use generic test data when possible</guideline>
<guideline>Avoid hard-coded production data</guideline>
<guideline>Document data dependencies between tests</guideline>
<guideline>Specify where data comes from (pre-created, generated, API)</guideline>
</guidelines>
<data_types>
<type name="Static Data">
<description>Fixed values that don't change</description>
<examples>Email addresses, usernames, product names</examples>
</type>
<type name="Dynamic Data">
<description>Data generated or fetched during test</description>
<examples>Timestamps, random IDs, generated user data</examples>
</type>
<type name="Pre-existing Data">
<description>Data already in system before test</description>
<examples>Existing users, products, orders</examples>
</type>
</data_types>
</test_data_management>
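The static/dynamic distinction can be sketched in code. A minimal Python example; the helper name, field names, and `example.com` domain are illustrative assumptions:

```python
import random
import string
import time

# Static data: a fixed value reused across runs.
STATIC_PRODUCT_NAME = "Sample Widget"

def generate_user_data(prefix: str = "testuser") -> dict:
    """Dynamic data: generate unique user fields per run so tests
    don't collide with pre-existing records."""
    suffix = f"{int(time.time())}_{''.join(random.choices(string.ascii_lowercase, k=6))}"
    return {
        "username": f"{prefix}_{suffix}",
        "email": f"{prefix}_{suffix}@example.com",
    }
```

Generating the unique part from a timestamp plus a random suffix keeps test cases independent, which matters for the "test case can run alone" criterion in the review checklist.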
<test_case_classification>
<description>Categorizing test cases for better organization</description>
</test_case_classification>
<test_case_review>
<description>Quality criteria for test cases</description>
<quality_checklist>
<item>Test case ID is unique and follows naming convention</item>
<item>Title clearly describes what is being tested</item>
<item>Priority is appropriate for the feature</item>
<item>Pre-conditions are clearly stated</item>
<item>Test steps are clear, sequential, and unambiguous</item>
<item>Each step describes a single action</item>
<item>Test data is specified where needed</item>
<item>Expected results are specific and verifiable</item>
<item>Test case is independent and can run alone</item>
<item>Test case is reproducible by another tester</item>
</quality_checklist>
</test_case_review>
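Part of this checklist is mechanically checkable. A minimal sketch, assuming test cases are held as plain dicts with illustrative keys (`id`, `title`, `steps`, `expected_result`):

```python
import re

# TC_MODULE_NUMBER naming convention, e.g. TC_AUTH_001
ID_PATTERN = re.compile(r"^TC_[A-Z]+_\d{3}$")

def review_test_case(case: dict) -> list:
    """Return violations for the mechanically checkable criteria."""
    problems = []
    if not ID_PATTERN.match(case.get("id", "")):
        problems.append("ID does not follow the TC_MODULE_NNN convention")
    if len(case.get("title", "").split()) < 3:
        problems.append("Title does not clearly describe what is being tested")
    if not case.get("steps"):
        problems.append("Test steps are missing")
    if not case.get("expected_result"):
        problems.append("Expected result is missing")
    return problems
```

Criteria like "priority is appropriate" or "reproducible by another tester" still require human review; a script like this only filters out the obvious gaps before that review.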
<linking_test_cases>
<description>Connecting related test cases</description>
<relationship name="Depends On">
<description>Test case requires another test to pass first</description>
<example>Test "Create order" depends on "Create user"</example>
</relationship>
<relationship name="Related">
<description>Tests cover related functionality</description>
<example>Login tests are related to registration tests</example>
</relationship>
<relationship name="Duplicate">
<description>Tests cover same scenario</description>
<action>Consider merging or clarifying differences</action>
</relationship>
</linking_test_cases>
<versioning_test_cases>
<description>Managing test case changes over time</description>
<guidelines>
<guideline>Update test cases when requirements change</guideline>
<guideline>Archive outdated test cases instead of deleting</guideline>
<guideline>Track version history for audits</guideline>
<guideline>Communicate changes to team</guideline>
</guidelines>
</versioning_test_cases>
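The "Depends On" relationship implies an execution order: every test must run after the tests it depends on. A sketch using Python's standard-library `graphlib`; the test case names are illustrative:

```python
from graphlib import TopologicalSorter

# Map each test case to the set of test cases it depends on.
dependencies = {
    "TC_ORD_001 Create order": {"TC_USR_001 Create user"},
    "TC_ORD_002 Cancel order": {"TC_ORD_001 Create order"},
}

def execution_order(deps: dict) -> list:
    """Return an order in which every test runs after its dependencies."""
    return list(TopologicalSorter(deps).static_order())
```

`TopologicalSorter` also raises `CycleError` on circular dependencies, which is itself a useful review signal: a dependency cycle between test cases usually means one of them should be split or made independent.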