# swift-missing-translations
End-to-end loop: audit → translate → wire up Swift-side leaks → verify. Use when the user wants to localize an iOS app or hunt down strings that haven't been translated yet.
## Step 0 — Reach shared understanding
Before running anything, derive what you can from the repo, then ask only what's left in a single batched message. Don't assume the source language.
Read first:
- `find . -name "*.xcstrings"` → catalog paths.
- For each catalog: `sourceLanguage` field, set of locales used (union of `localizations` keys).
- UI source roots — typical: `Features/`, `Views/`, `App/`, `Components/`, `Screens/`.
- In-app language picker? `grep -rn "selectedLanguage\|@AppStorage.*lang"` and look for `.locale` injected at the root view. Determines whether `references/wire-up.md` §d applies.
- Sample 3 translated keys with placeholders to learn the existing tone and the project's placeholder convention.
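The per-catalog audit above can be sketched in a few lines of Python. This is a minimal sketch, not part of the skill itself; it assumes the standard `.xcstrings` JSON layout (a top-level `sourceLanguage` string and a `strings` dict whose entries carry a `localizations` dict keyed by locale):

```python
import json
from pathlib import Path

def audit_catalog(path: Path) -> tuple[str, set[str]]:
    """Return (sourceLanguage, union of locales) for one .xcstrings catalog."""
    catalog = json.loads(path.read_text())
    locales: set[str] = set()
    for entry in catalog.get("strings", {}).values():
        # Entries with no translations yet may lack "localizations" entirely.
        locales.update(entry.get("localizations", {}).keys())
    return catalog["sourceLanguage"], locales

if __name__ == "__main__":
    # Walk the repo the same way the `find` one-liner does.
    for path in Path(".").rglob("*.xcstrings"):
        source, locales = audit_catalog(path)
        print(f"{path}: source={source} locales={sorted(locales)}")
```

The union across entries matters: a locale can be present on some keys and absent on others, so reading only one entry understates the catalog's locale set.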
Ask only when not derivable:
- Tone and audience (kid-friendly second-person, professional B2B, etc.). Critical, not in the repo.
- Adding a brand-new locale? Only if the user named one not in the catalog.
- Dev/TestFlight quality vs App Store quality (the latter warrants a native-speaker review pass afterward).
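Once Step 0 is settled, the actual hunt for untranslated strings reduces to checking which keys lack a localization for a target locale. A hedged sketch, assuming the same `.xcstrings` layout as above (keys that exist but are marked needs-review would require additional per-entry `state` inspection, omitted here):

```python
import json
from pathlib import Path

def missing_keys(catalog_path: Path, locale: str) -> list[str]:
    """Keys in the catalog with no localization entry at all for `locale`.

    Entries present for the locale but not yet reviewed are NOT flagged;
    this only detects wholly missing translations.
    """
    catalog = json.loads(catalog_path.read_text())
    return [
        key
        for key, entry in catalog.get("strings", {}).items()
        if locale not in entry.get("localizations", {})
    ]
```

Running this per catalog and per locale gives the audit list the translate step consumes.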