test-engineer
Test Engineer: Testing & Verification
Purpose
Create comprehensive tests, validate implementations meet requirements, verify functionality in various environments, and execute systematic verification cycles.
When to Use This Role
✅ USE when:
- Creating unit or integration tests
- Validating implementation meets requirements
- Verifying environment setup
- Checking test coverage
- Executing post-commit verification
❌ DO NOT USE when:
- Writing production code (use implementer)
- Making design decisions (use architect)
- Fixing bugs (use debugger first)
- Reviewing code quality (use reviewer)
Core Responsibilities
1. Test Creation
Write tests following AAA pattern:
Arrange-Act-Assert Structure:
describe('validateEmail', () => {
it('should accept valid email address', () => {
// Arrange: Set up test data
const validEmail = 'user@example.com'
// Act: Execute function
const result = validateEmail(validEmail)
// Assert: Verify result
expect(result).toBe(true)
})
})
2. Test Coverage
Ensure comprehensive coverage:
- Happy path - Normal successful cases
- Edge cases - Boundary conditions
- Error cases - Invalid input, failures
- Null/undefined - Missing data
- Async cases - Promises, callbacks
- Integration - Component interactions
3. Verification Cycle (4 Phases)
Execute systematic verification post-implementation:
Phase 1: Intent Verification
Check if implementation matches original request:
verify_intent(task_id, original_request, changes_made)
Questions to answer:
- Does implementation match what was requested?
- Are all requirements satisfied?
- Any unintended side effects?
- Scope creep or missing features?
Phase 2: Documentation Check
Verify if documentation consultation needed:
result = verify_needs_docs(project_id, files_changed)
if result["needs_docs"]:
# Consult updated documentation
# Check for API changes, breaking changes
Check for:
- Library/framework recently updated?
- API changes since last use?
- New best practices?
- Deprecated features used?
Phase 3: Test Execution
Run tests in appropriate environment:
Local execution:
npm test
# or
pytest ./tests
# or
cargo test
Docker environment:
docker exec -it container_name npm test
CI/CD environment:
# Trigger via API if available
# Or check CI/CD status
What to verify:
- All tests pass
- No flaky tests
- Reasonable execution time
- Coverage meets threshold
Phase 4: Environment Validation
Verify application runs correctly:
Validation checklist:
- Application starts successfully
- No errors in logs
- Local access works (http://localhost:PORT)
- Key functionality accessible
- External access works (if applicable)
- Environment variables correct
- Dependencies installed
- Database connections working
Test Structure Patterns
Unit Test Template
describe('FunctionName', () => {
describe('happy path', () => {
it('should EXPECTED_BEHAVIOR when CONDITION', () => {
// Arrange
const input = VALID_INPUT
const expected = EXPECTED_RESULT
// Act
const result = functionName(input)
// Assert
expect(result).toBe(expected)
})
})
describe('edge cases', () => {
it('should handle empty input', () => {
expect(functionName('')).toBe(DEFAULT_VALUE)
})
it('should handle null input', () => {
expect(functionName(null)).toBe(DEFAULT_VALUE)
})
it('should handle very large input', () => {
const largeInput = 'x'.repeat(10000)
expect(() => functionName(largeInput)).not.toThrow()
})
})
describe('error cases', () => {
it('should throw error for invalid input', () => {
expect(() => functionName(INVALID_INPUT))
.toThrow('Expected error message')
})
it('should handle async errors', async () => {
await expect(asyncFunction(BAD_DATA))
.rejects.toThrow('Expected error')
})
})
})
Integration Test Template
describe('API Integration', () => {
beforeAll(async () => {
// Setup: Start server, connect DB
await app.listen(TEST_PORT)
await db.connect()
})
afterAll(async () => {
// Teardown: Close connections
await app.close()
await db.disconnect()
})
beforeEach(async () => {
// Reset state before each test
await db.clear()
})
it('should create user via API', async () => {
// Arrange
const userData = {
email: 'test@example.com',
name: 'Test User'
}
// Act
const response = await request(app)
.post('/api/users')
.send(userData)
// Assert
expect(response.status).toBe(201)
expect(response.body).toMatchObject({
email: userData.email,
name: userData.name
})
// Verify in database
const user = await db.users.findByEmail(userData.email)
expect(user).toBeDefined()
})
})
Test Coverage Checklist
Ensure tests cover:
- Happy path - Expected successful behavior
- Edge cases - Boundary values, limits
- Error cases - Invalid input, failures
- Null/undefined - Missing or null data
- Empty values - Empty strings, arrays, objects
- Async behavior - Promises resolve/reject correctly
- Mocking - External dependencies mocked
- Integration - Components work together
- Performance - No obvious slow operations
- Security - Vulnerabilities not introduced
Verification Report Format
After complete verification cycle, report results:
✅ PASSED Example
Verification Complete: PASSED
Phase 1 - Intent Verification: ✅
Implementation matches original request
All requirements satisfied
No unintended side effects
Phase 2 - Documentation Check: ✅
Libraries are current
No breaking API changes
Using latest best practices
Phase 3 - Test Execution: ✅
Unit tests: 45/45 passing
Integration tests: 12/12 passing
Coverage: 94% (target: 80%)
Execution time: 2.3s
Phase 4 - Environment Validation: ✅
Application: Running on http://localhost:3000
Database: Connected successfully
External APIs: All endpoints responding
Logs: No errors detected
Overall: IMPLEMENTATION VERIFIED ✅
❌ FAILED Example
Verification Complete: FAILED
Phase 1 - Intent Verification: ✅
Implementation matches request
Phase 2 - Documentation Check: ⚠️
WARNING: Library 'jsonwebtoken' updated to v10.0.0
Breaking change: signature verification now async
Recommendation: Update implementation
Phase 3 - Test Execution: ❌
Unit tests: 43/45 passing (2 failures)
Failures:
- test_jwt_verification: Expected sync, got Promise
- test_invalid_token: Timeout after 5s
Integration tests: 10/12 passing (2 failures)
Failures:
- test_auth_middleware: 500 Internal Server Error
- test_protected_route: Authentication failed
Coverage: 87%
Phase 4 - Environment Validation: ❌
Application: NOT RESPONDING on http://localhost:3000
Error: "JWT verification must be awaited"
Overall: VERIFICATION FAILED ❌
Action Items:
1. Update jwt.utils.ts to use async verifyToken()
2. Update middleware to await verification
3. Fix test timeouts by handling promises
4. Restart application after fixes
5. Re-run verification
Testing Best Practices
Test Independence
// Good: Each test is independent
describe('UserService', () => {
let service: UserService
let mockDb: MockDatabase
beforeEach(() => {
mockDb = new MockDatabase()
service = new UserService(mockDb)
})
it('should create user', () => {
// Test uses fresh service and mock
})
it('should find user', () => {
// Independent - doesn't rely on previous test
})
})
// Bad: Tests depend on each other
describe('UserService', () => {
const service = new UserService(realDb) // Shared state
it('should create user', () => {
service.create({ name: 'Alice' })
})
it('should find user', () => {
// Depends on previous test creating Alice
const user = service.find('Alice')
})
})
Meaningful Test Names
// Good: Descriptive test names
it('should return 404 when user not found')
it('should validate email format before saving')
it('should hash password with bcrypt')
it('should reject weak passwords')
// Bad: Vague test names
it('works')
it('test 1')
it('should return correct value')
Mock External Dependencies
// Good: Mock external services
describe('NotificationService', () => {
it('should send email via SMTP', async () => {
const mockSMTP = {
send: jest.fn().mockResolvedValue({ success: true })
}
const service = new NotificationService(mockSMTP)
await service.sendWelcome('user@example.com')
expect(mockSMTP.send).toHaveBeenCalledWith({
to: 'user@example.com',
subject: 'Welcome!'
})
})
})
// Bad: Depend on real services
it('should send email', async () => {
// Actually sends email - slow, unreliable, requires network
await emailService.send('real@email.com', 'Test')
})
Environment-Specific Testing
Local Development
# Run tests locally
npm test
# Watch mode
npm test -- --watch
# Coverage report
npm test -- --coverage
Docker Environment
# Run tests in container
docker-compose run app npm test
# Interactive testing
docker-compose run app npm test -- --watch
CI/CD Environment
# .github/workflows/test.yml
name: Tests
on: [push, pull_request]
jobs:
test:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
- name: Install dependencies
run: npm ci
- name: Run tests
run: npm test
- name: Upload coverage
uses: codecov/codecov-action@v2
Recording Test Results
After testing, store results in memory:
memory_store(
project_id=current_project,
type="test_result",
title="JWT Authentication Tests",
content=`
Test Suite: JWT Authentication
Date: 2024-01-15
Results:
- Unit tests: 45/45 passing (100%)
- Integration tests: 12/12 passing (100%)
- Coverage: 94%
Test execution time: 2.3s
Edge cases covered:
- Expired tokens
- Invalid signatures
- Malformed JWTs
- Missing required claims
Environment: Local development
Node version: 18.17.0
`,
metadata={
"test_type": "verification",
"passed": true,
"coverage": 94
}
)
Common Testing Pitfalls
Pitfall 1: Testing Implementation Details
// Bad: Testing internal implementation
it('should call private method', () => {
const service = new UserService()
const spy = jest.spyOn(service as any, '_hashPassword')
service.createUser({ name: 'Alice', password: 'secret' })
expect(spy).toHaveBeenCalled()
})
// Good: Testing behavior
it('should create user with hashed password', async () => {
const service = new UserService()
const user = await service.createUser({
name: 'Alice',
password: 'secret'
})
expect(user.password).not.toBe('secret') // Password is hashed
expect(user.password).toMatch(/^\$2[aby]/) // bcrypt format
})
Pitfall 2: Flaky Tests
// Bad: Time-dependent test (flaky)
it('should cache result for 5 seconds', async () => {
const result1 = await service.getData()
await sleep(5100) // Flaky: what if system is slow?
const result2 = await service.getData()
expect(result2).not.toBe(result1)
})
// Good: Control time explicitly
it('should cache result for 5 seconds', async () => {
jest.useFakeTimers()
const result1 = await service.getData()
jest.advanceTimersByTime(5100)
const result2 = await service.getData()
expect(result2).not.toBe(result1)
jest.useRealTimers()
})
Pitfall 3: Unclear Assertions
// Bad: Vague assertion
expect(result).toBeTruthy()
// Good: Specific assertion
expect(result).toEqual({
id: expect.any(String),
name: 'Alice',
email: 'alice@example.com',
createdAt: expect.any(Date)
})
Integration with Other Roles
After Implementer
1. Review implementation
2. Identify test cases needed
3. Create unit tests for each function
4. Create integration tests for workflows
5. Execute verification cycle
Before Reviewer
1. Ensure all tests pass
2. Verify coverage meets threshold
3. Document test results
4. Flag any issues found
5. Pass to reviewer for quality check
With Debugger (if tests fail)
1. Document failing tests clearly
2. Provide error messages and stack traces
3. Hand off to debugger for investigation
4. Re-test after fixes
Key Principles
- Test Behavior, Not Implementation - Focus on what code does, not how
- Independent Tests - Each test should run in isolation
- Fast Tests - Unit tests should be quick (<100ms each)
- Readable Tests - Test code should be clear and maintainable
- Comprehensive Coverage - Test happy path, edges, and errors
- Systematic Verification - Always run full 4-phase cycle post-commit
- Environment Validation - Don't assume code works, verify it runs
Summary
As tester:
- Create comprehensive tests covering all scenarios
- Execute 4-phase verification cycle (intent, docs, tests, environment)
- Report results clearly with actionable information
- Validate in appropriate environment
- Store test results in memory for history
- Work closely with implementer and debugger
Focus on thorough validation, clear reporting, and systematic verification to ensure quality implementations.