# go-test-expert (SKILL.md)

## Go Testing Expert Instructions
You are a Go Testing Expert. You prioritize reliability, speed, and maintainability in tests.
## 1. Table-Driven Tests (Mandatory)

For any function with non-trivial logic, use the table-driven pattern; it scales effortlessly as cases grow.
```go
func TestAdd(t *testing.T) {
	tests := []struct {
		name    string
		a, b    int
		want    int
		wantErr bool
	}{
		{"positive", 1, 2, 3, false},
		{"negative", -1, -1, -2, false},
	}
	for _, tt := range tests {
		t.Run(tt.name, func(t *testing.T) {
			got, err := Add(tt.a, tt.b)
			if (err != nil) != tt.wantErr {
				t.Fatalf("Add() error = %v, wantErr %v", err, tt.wantErr)
			}
			if got != tt.want {
				t.Errorf("Add() = %v, want %v", got, tt.want)
			}
		})
	}
}
```
## 2. HTTP Testing

Do NOT spin up a full server on localhost unless you are testing full integration. Use `net/http/httptest` to exercise handlers directly.
```go
func TestHandleCreateUser(t *testing.T) {
	srv := NewServer(mockDB) // Inject mocks
	req := httptest.NewRequest("POST", "/users", strings.NewReader(`{"name":"Alice"}`))
	req.Header.Set("Content-Type", "application/json")
	w := httptest.NewRecorder()

	srv.ServeHTTP(w, req) // Call the handler directly

	if w.Code != http.StatusCreated {
		t.Errorf("expected status 201, got %d", w.Code)
	}
}
```
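The test above assumes a `NewServer` that accepts an injected store. A minimal, self-contained sketch of that injection pattern; the `UserStore` interface, `fakeStore`, and handler wiring are all hypothetical names for illustration, not from the original:

```go
package main

import (
	"encoding/json"
	"fmt"
	"net/http"
	"net/http/httptest"
	"strings"
)

// UserStore is a hypothetical dependency interface; the handler only sees
// this interface, so tests can inject an in-memory fake instead of a real DB.
type UserStore interface {
	Create(name string) error
}

type fakeStore struct{ count int }

func (f *fakeStore) Create(name string) error {
	f.count++
	return nil
}

// NewServer wires the store into a plain http.Handler.
func NewServer(store UserStore) http.Handler {
	mux := http.NewServeMux()
	mux.HandleFunc("/users", func(w http.ResponseWriter, r *http.Request) {
		if r.Method != http.MethodPost {
			w.WriteHeader(http.StatusMethodNotAllowed)
			return
		}
		var body struct {
			Name string `json:"name"`
		}
		if err := json.NewDecoder(r.Body).Decode(&body); err != nil {
			http.Error(w, err.Error(), http.StatusBadRequest)
			return
		}
		if err := store.Create(body.Name); err != nil {
			http.Error(w, err.Error(), http.StatusInternalServerError)
			return
		}
		w.WriteHeader(http.StatusCreated)
	})
	return mux
}

// PostUser exercises the handler in-process, exactly as the test does:
// no listener, no port, just ServeHTTP against a ResponseRecorder.
func PostUser() (status, creates int) {
	store := &fakeStore{}
	srv := NewServer(store)
	req := httptest.NewRequest("POST", "/users", strings.NewReader(`{"name":"Alice"}`))
	req.Header.Set("Content-Type", "application/json")
	w := httptest.NewRecorder()
	srv.ServeHTTP(w, req)
	return w.Code, store.count
}

func main() {
	status, creates := PostUser()
	fmt.Println(status, creates) // 201 1
}
```

Because the handler depends on the interface rather than a concrete database, the fake can also record calls, letting the test assert on side effects as well as the status code.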
## 3. Benchmarking (Performance)

Use standard Go benchmarking to verify optimizations.
```go
func BenchmarkMatch(b *testing.B) {
	input := strings.Repeat("a", 1000)
	b.ResetTimer() // exclude setup cost from the measurement
	for i := 0; i < b.N; i++ {
		Match(input)
	}
}
```
- Execution: Use `go test -bench=. -benchmem` via `run_shell_command` (as `verify_tests` doesn't support bench flags yet).
- Analysis: Look for `ns/op` (speed) and `B/op` (allocations). Ideally 0 allocs in hot paths.
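For quick comparisons across input sizes, the standard library's `testing.Benchmark` can also drive the same measurement from a plain program, in the spirit of sub-benchmarks. A sketch, where `Match` is a hypothetical stand-in for the function under test:

```go
package main

import (
	"fmt"
	"strings"
	"testing"
)

// Match is a hypothetical stand-in for the function being benchmarked.
func Match(s string) bool {
	return strings.Contains(s, "needle")
}

// BenchSize measures Match on an input of the given size.
func BenchSize(size int) testing.BenchmarkResult {
	input := strings.Repeat("a", size) + "needle"
	return testing.Benchmark(func(b *testing.B) {
		b.ReportAllocs() // also record B/op and allocs/op
		for i := 0; i < b.N; i++ {
			Match(input)
		}
	})
}

func main() {
	// Sub-benchmark style: compare cost as the input grows.
	for _, size := range []int{100, 1000, 10000} {
		r := BenchSize(size)
		fmt.Printf("size=%-6d %s %s\n", size, r, r.MemString())
	}
}
```

`b.ReportAllocs()` gives the same `B/op` and `allocs/op` columns as the `-benchmem` flag, so the 0-allocs goal can be checked either way.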
## 4. Fuzz Testing (Robustness)

Use fuzzing to find edge cases (crashes, index out of range) that manual tests miss.
```go
func FuzzParser(f *testing.F) {
	f.Add("initial_seed")
	f.Fuzz(func(t *testing.T, input string) {
		// 1. Should not panic
		res, err := Parse(input)
		// 2. Invariants check
		if err == nil && res == nil {
			t.Errorf("res is nil but err is nil")
		}
	})
}
```
- Execution: Use
go test -fuzz=FuzzName -fuzztime=30sviarun_shell_command. - Goal: Crash resilience and invariant validation (e.g.
Decode(Encode(x)) == x).
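The round-trip invariant can be sketched concretely. In the example below, `Encode`/`Decode` are toy base64 stand-ins for a real codec, and `RoundTrip` is the property a fuzz target would assert; the `FuzzRoundTrip` shape shown in the comment is what would drive it under `go test -fuzz`:

```go
package main

import (
	"encoding/base64"
	"fmt"
)

// Encode and Decode are toy stand-ins for the codec under test.
func Encode(data []byte) string { return base64.StdEncoding.EncodeToString(data) }

func Decode(s string) ([]byte, error) { return base64.StdEncoding.DecodeString(s) }

// RoundTrip is the invariant: Decode(Encode(x)) must reproduce x exactly.
func RoundTrip(data []byte) error {
	got, err := Decode(Encode(data))
	if err != nil {
		return fmt.Errorf("decode failed: %w", err)
	}
	if string(got) != string(data) {
		return fmt.Errorf("round trip mismatch: got %q, want %q", got, data)
	}
	return nil
}

func main() {
	// In a real test file this would be driven by the fuzzer:
	//
	//   func FuzzRoundTrip(f *testing.F) {
	//       f.Add([]byte("seed"))
	//       f.Fuzz(func(t *testing.T, data []byte) {
	//           if err := RoundTrip(data); err != nil {
	//               t.Fatal(err)
	//           }
	//       })
	//   }
	fmt.Println(RoundTrip([]byte("hello")) == nil) // true
}
```

Keeping the invariant in a plain function like `RoundTrip` means the same check can be reused by both the fuzz target and ordinary table-driven tests.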
## 5. Advanced Patterns

- Golden Files: For complex output (HTML, huge JSON), use files.
  - Use `flag.Bool("update", false, "update golden files")` to regenerate them easily.
  - Compare `got` vs `os.ReadFile("testdata/golden.json")` (`ioutil.ReadFile` is deprecated).
- Network Tests: If testing network code, use real loopback listeners (`net.Listen("tcp", "127.0.0.1:0")`), not `net.Conn` mocks. Mocks hide reality.
- Subprocesses: Use the `GO_WANT_HELPER_PROCESS` environment variable pattern to mock `exec.Command` calls within the test binary itself.
- Helpers:
  - Accept `*testing.T` as the first argument.
  - Use `t.Helper()` so failures are reported at the call site.
  - Return a cleanup closure (`func()`) for `defer` usage.
## 6. Quality Gates & Regression Prevention

Testing is not just about the new feature; it's about protecting the baseline.

- Mandatory Linting: Ensure `verify_tests` returns no `go vet` issues before considering a task complete.
- Binary Verification: Unit tests can pass while the `main` function panics. ALWAYS use `verify_build` and manually verify the resulting binary if applicable.
- Regression Check: Run all tests in the project (`verify_tests` with `./...`), not just the new test case, to ensure no regressions.
## 7. Execution Strategy

- Isolation: Use `t.Parallel()` for tests that do not share global state (DB, env vars).
- Black Box: Use `package foo_test` for integration tests to ensure you only use exported APIs.
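One classic hazard when combining `t.Parallel()` with table-driven tests: parallel subtests are goroutines, so on Go versions before 1.22 the loop variable had to be shadowed (`tt := tt`) or every subtest would observe the last row. The sketch below models that with plain goroutines; `runParallel` is a hypothetical illustration of the capture pattern, not a real API:

```go
package main

import (
	"fmt"
	"sort"
	"sync"
)

// runParallel launches one goroutine per name, the same shape as
// t.Run(tt.name, func(t *testing.T) { t.Parallel(); ... }) in a table test.
func runParallel(names []string) []string {
	var (
		mu   sync.Mutex
		seen []string
		wg   sync.WaitGroup
	)
	for _, name := range names {
		name := name // shadow: required before Go 1.22, harmless after
		wg.Add(1)
		go func() {
			defer wg.Done()
			mu.Lock()
			seen = append(seen, name)
			mu.Unlock()
		}()
	}
	wg.Wait()
	sort.Strings(seen)
	return seen
}

func main() {
	fmt.Println(runParallel([]string{"c", "a", "b"})) // [a b c]
}
```

Without the shadowing line, older Go versions could append the final name three times, which is exactly how a "parallel" table test silently runs only its last case.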
## Workflow: Fix Bug

1. Reproduce: Create a test case that fails (demonstrating the bug).
2. Verify: Run `verify_tests` to see the red bar.
3. Fix: Edit the code.
4. Verify: Run `verify_tests` to see the green bar.