# OpenRouter Usage
Query OpenRouter API costs and generate usage reports via the command line.
## When to Use
- Checking OpenRouter spend over the last 30 days
- Breaking down costs by model, provider, or date
- Exporting usage data as CSV or JSON for accounting
- Auditing which models and API keys consume the most credits
## Prerequisites
- `OPENROUTER_API_KEY` environment variable (or in a project `.env` file)
- The key must be a Management key — create one at https://openrouter.ai/settings/management-keys
- Regular provisioning keys return 401/403 on analytics endpoints
## Usage
```bash
# Full 30-day summary with credits balance
python3 ~/.claude/skills/openrouter-usage/scripts/openrouter_usage.py

# Filter to a single date
python3 ~/.claude/skills/openrouter-usage/scripts/openrouter_usage.py --date 2026-02-01

# Export as CSV (for spreadsheets)
python3 ~/.claude/skills/openrouter-usage/scripts/openrouter_usage.py --csv

# Raw JSON output (for piping to jq)
python3 ~/.claude/skills/openrouter-usage/scripts/openrouter_usage.py --json

# Skip credits lookup (activity only)
python3 ~/.claude/skills/openrouter-usage/scripts/openrouter_usage.py --no-credits
```
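The `--json` output can be post-processed in jq or Python. As a sketch, assuming the rows carry the activity fields listed under API Endpoints (the script's exact JSON envelope is not documented here), a per-model total can be computed like this:

```python
def spend_by_model(rows: list[dict]) -> dict[str, float]:
    """Sum the `usage` (cost) field per model across activity rows."""
    totals: dict[str, float] = {}
    for row in rows:
        totals[row["model"]] = totals.get(row["model"], 0.0) + row["usage"]
    return totals
```

Feed it the parsed row list (e.g. `json.loads` of the script's stdout) to get a model-to-cost mapping ready for sorting or reporting.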
## Output
The default report includes:
- Credits balance: total purchased, used, remaining
- Spend by model: per-model cost with token counts (prompt, completion, reasoning)
- Spend by provider: aggregated by provider (OpenAI, Anthropic, Google, etc.)
- Spend by date: daily spend with visual bar chart
## API Endpoints
Both endpoints are read-only GETs authenticated via a Bearer token:

| Endpoint | Purpose |
|---|---|
| `GET /api/v1/credits` | Total credits purchased and used |
| `GET /api/v1/activity` | Last 30 days of usage grouped by model/endpoint/provider |
The activity endpoint returns, per row: `date`, `model`, `provider_name`, `usage`, `byok_usage_inference`, `requests`, `prompt_tokens`, `completion_tokens`, `reasoning_tokens`.
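Calling these endpoints directly is straightforward. A minimal sketch using only the standard library — the `{"data": {...}}` envelope and the `total_credits`/`total_usage` field names for the credits response are assumptions, so verify them against OpenRouter's API reference:

```python
import json
import os
import urllib.request

API_BASE = "https://openrouter.ai/api/v1"

def fetch(endpoint: str) -> dict:
    """GET an analytics endpoint using the Management key from the environment."""
    req = urllib.request.Request(
        API_BASE + endpoint,
        headers={"Authorization": "Bearer " + os.environ["OPENROUTER_API_KEY"]},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

def remaining_credits(credits: dict) -> float:
    """Purchased minus used, from an assumed /credits response envelope."""
    data = credits["data"]
    return data["total_credits"] - data["total_usage"]
```

With a Management key set, `remaining_credits(fetch("/credits"))` would report the balance shown in the default report; a provisioning key would raise `HTTPError` (401/403), matching the note under Prerequisites.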