Provider Implementation

Checklist

  1. Implement the full interface (ChatModel, Embedder, VectorStore, STT, TTS, etc.).
  2. Register the provider in init() via the parent package's Register().
  3. Map provider errors to core.Error with correct ErrorCode.
  4. Support context cancellation.
  5. Include token/usage metrics where applicable.
  6. Compile-time check: var _ Interface = (*Impl)(nil).
  7. Unit tests with mocked HTTP responses (httptest).
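Checklist item 7 can be exercised with net/http/httptest, which stands in for the provider's endpoint so tests never touch the network. A minimal sketch, assuming an illustrative client that GETs JSON from a `/v1/chat` path (the client function, response shape, and path are examples, not Beluga's actual API):

```go
package main

import (
	"encoding/json"
	"fmt"
	"net/http"
	"net/http/httptest"
)

// fakeCompletion is an illustrative response shape; real recorded
// responses would live under testdata/.
type fakeCompletion struct {
	Content string `json:"content"`
}

// callProvider issues one request against baseURL and decodes the body.
func callProvider(baseURL string) (string, error) {
	resp, err := http.Get(baseURL + "/v1/chat")
	if err != nil {
		return "", err
	}
	defer resp.Body.Close()
	var out fakeCompletion
	if err := json.NewDecoder(resp.Body).Decode(&out); err != nil {
		return "", err
	}
	return out.Content, nil
}

func main() {
	// httptest.NewServer serves the canned response on a local port.
	srv := httptest.NewServer(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		json.NewEncoder(w).Encode(fakeCompletion{Content: "hello"})
	}))
	defer srv.Close()

	got, err := callProvider(srv.URL)
	fmt.Println(got, err)
}
```

In a real test the handler would replay a fixture from testdata/ and assertions would cover both success and error-status responses.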

File Structure

llm/providers/openai/
├── openai.go          # Implementation + New() + init()
├── stream.go          # Streaming
├── errors.go          # Error mapping
├── openai_test.go     # Tests
└── testdata/          # Recorded HTTP responses

Template

var _ llm.ChatModel = (*Model)(nil)

func init() {
    llm.Register("openai", func(cfg llm.ProviderConfig) (llm.ChatModel, error) {
        return New(cfg)
    })
}

func New(cfg llm.ProviderConfig) (*Model, error) {
    if cfg.APIKey == "" {
        return nil, &core.Error{Op: "openai.new", Code: core.ErrAuth, Message: "API key required"}
    }
    return &Model{client: newClient(cfg.APIKey, cfg.BaseURL), model: cfg.Model}, nil
}

func (m *Model) Stream(ctx context.Context, msgs []schema.Message, opts ...llm.GenerateOption) iter.Seq2[schema.StreamChunk, error] {
    return func(yield func(schema.StreamChunk, error) bool) { /* stream implementation */ }
}

Error Mapping

switch apiErr.StatusCode {
case 401: code = core.ErrAuth
case 429: code = core.ErrRateLimit
case 408, 504: code = core.ErrTimeout
case 400: code = core.ErrInvalidInput
}
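The switch above can be wrapped in a small mapping function. A sketch with local stand-ins for the core error codes; the default fallback (ErrProvider) is an assumption for unlisted statuses such as 5xx, not something the source defines:

```go
package main

import "fmt"

// ErrorCode stands in for beluga-ai's core error-code type; the real
// constants live in the core package.
type ErrorCode string

const (
	ErrAuth         ErrorCode = "auth"
	ErrRateLimit    ErrorCode = "rate_limit"
	ErrTimeout      ErrorCode = "timeout"
	ErrInvalidInput ErrorCode = "invalid_input"
	ErrProvider     ErrorCode = "provider" // assumed fallback, not from the source
)

// mapStatus maps an HTTP status to an ErrorCode, falling back to a
// generic provider error for anything unlisted.
func mapStatus(status int) ErrorCode {
	switch status {
	case 401:
		return ErrAuth
	case 429:
		return ErrRateLimit
	case 408, 504:
		return ErrTimeout
	case 400:
		return ErrInvalidInput
	default:
		return ErrProvider
	}
}

func main() {
	for _, s := range []int{401, 429, 408, 400, 500} {
		fmt.Println(s, mapStatus(s))
	}
}
```

Keeping the mapping in one function (in errors.go) gives a single place to table-test every status the provider can return.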

See docs/providers.md for provider categories and priorities.
