dspy-ollama


Ollama — Run DSPy with Local Models

Guide the user through running DSPy with local models via Ollama. No API keys, no cloud costs, full privacy.

What is Ollama?

Ollama is a local LLM runner (166k+ GitHub stars) that wraps llama.cpp. It downloads, manages, and serves models locally with a simple CLI. DSPy connects to it through LiteLLM's ollama_chat/ provider.
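For example, a minimal connection sketch (the model name llama3.1 is just an illustration; use whichever model you have pulled):

import dspy

# Route requests through LiteLLM's ollama_chat/ provider prefix to the local server.
# "llama3.1" is an example model name, not a requirement.
lm = dspy.LM(
    "ollama_chat/llama3.1",
    api_base="http://localhost:11434",  # Ollama's default address
    api_key="",                         # no key needed for a local server
)
dspy.configure(lm=lm)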

Setup

Install Ollama

# macOS
brew install ollama

# Linux
curl -fsSL https://ollama.com/install.sh | sh
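After installing, make sure the server is running (ollama serve) and download a model with ollama pull. Below is a minimal end-to-end sketch, assuming a model named llama3.1 has been pulled and the server is listening on its default port:

import dspy

# Assumes `ollama pull llama3.1` has been run and the server is up on localhost:11434.
lm = dspy.LM("ollama_chat/llama3.1", api_base="http://localhost:11434", api_key="")
dspy.configure(lm=lm)

# A tiny DSPy program: one Predict module with a question -> answer signature.
qa = dspy.Predict("question -> answer")
print(qa(question="What is the capital of France?").answer)

From here, any DSPy module (Predict, ChainOfThought, and so on) runs against the local model through the configured LM, with no API keys involved.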