# Browsing Bluesky
Access Bluesky content through public APIs and real-time firehose. Supports optional authentication for personalized feeds. Includes account analysis for categorization.
## Implementation

Add the skill directory to the path and import:

```python
import sys
sys.path.insert(0, '/path/to/skills/browsing-bluesky')  # or use the .claude/skills symlink path

from browsing_bluesky import (
    # Core browsing
    search_posts, get_user_posts, get_profile, get_feed_posts, sample_firehose,
    get_thread, get_quotes, get_likes, get_reposts,
    get_followers, get_following, search_users,
    # Trending
    get_trending, get_trending_topics,
    # Account analysis
    get_all_following, get_all_followers, extract_post_text,
    extract_keywords, analyze_account, analyze_accounts,
    # Authentication utilities
    is_authenticated, get_authenticated_user, clear_session
)
```
## Authentication (Optional)

Authentication enables personalized feeds (like Paper Skygest) that require knowing who's asking.

### Setup

- Create an app password at Bluesky: Settings → Privacy and Security → App Passwords
- Set environment variables:

```bash
export BSKY_HANDLE="yourhandle.bsky.social"
export BSKY_APP_PASSWORD="xxxx-xxxx-xxxx-xxxx"
```
### Behavior

- Transparent: All functions work identically with or without credentials
- Automatic: Auth headers are added opportunistically when credentials exist
- Graceful: Failed auth silently falls back to public access
- Secure: Tokens are cached in memory only, never logged or persisted
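The credential check behind this opportunistic behavior can be sketched as follows. This is a hypothetical helper for illustration, not the skill's actual internals (the real skill also exchanges the app password for a session token):

```python
import os

def auth_mode():
    # Sketch of the credential check implied by the bullets above:
    # with both variables set the skill authenticates; otherwise it
    # silently uses public access.
    if os.environ.get("BSKY_HANDLE") and os.environ.get("BSKY_APP_PASSWORD"):
        return "authenticated"
    return "public"
```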
### Check Auth Status

```python
if is_authenticated():
    print(f"Logged in as: {get_authenticated_user()}")
else:
    print("Using public access")

# Clear the session if needed (e.g., when switching accounts)
clear_session()
```
## Research Workflows

### Investigate a Topic

Use search_posts() with query syntax matching bsky.app advanced search:

- Basic terms: `event sourcing`
- Exact phrases: `"event sourcing"`
- User filter: `from:acairns.co.uk` or use the `author=` param
- Date filter: `since:2025-01-01` or use the `since=` param
- Hashtags, mentions, domain links: `#python mentions:user domain:github.com`
Combine query syntax with function params for complex searches.
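As an illustration of combining operators, here is a tiny hypothetical helper (not part of the skill) that assembles a query string from the pieces above:

```python
def build_query(terms, phrase=None, author=None, since=None, hashtags=()):
    # Hypothetical helper: compose a search query string using the
    # operators listed above. The result is an ordinary query string
    # suitable for search_posts().
    parts = list(terms)
    if phrase:
        parts.append(f'"{phrase}"')
    if author:
        parts.append(f"from:{author}")
    if since:
        parts.append(f"since:{since}")
    parts.extend(f"#{tag}" for tag in hashtags)
    return " ".join(parts)

query = build_query(["event"], phrase="event sourcing",
                    author="acairns.co.uk", since="2025-01-01")
# query can then be passed to search_posts(query, limit=25)
```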
### Monitor a User

- Fetch the profile with `get_profile(handle)` for context (bio, follower count, post count)
- Get recent posts with `get_user_posts(handle, limit=N)`
- For topic-specific user content, use `search_posts(query, author=handle)`
### Discover What's Trending

Recommended workflow — trending API first, firehose for deep dives:

1. Quick scan with trending topics (~500 tokens):

   ```python
   topics = get_trending_topics(limit=10)
   # Returns: {topics: [{topic, display_name, description, link}, ...],
   #           suggested: [...]}
   ```

2. Rich trends with post counts and actors:

   ```python
   trends = get_trending(limit=10)
   for t in trends:
       print(f"{t['display_name']} — {t['post_count']} posts ({t['status']})")
   # Each trend includes: topic, display_name, link, started_at,
   # post_count, status, category, actors
   ```

3. Targeted exploration of selected trends:

   ```python
   posts = search_posts(trend["topic"], limit=25)
   ```

4. Optional: firehose for velocity monitoring or long-tail discovery.

   Prerequisites: install Node.js dependencies once per session:

   ```bash
   cd /home/claude && npm install ws https-proxy-agent 2>/dev/null
   ```

   ```python
   data = sample_firehose(duration=30)                   # Full firehose sample
   data = sample_firehose(duration=20, filter="python")  # Filtered sample
   ```
Returns a dict with keys:

- `window`: `{startTime, endTime, durationSeconds}` — sampling time range
- `stats`: `{totalReceived, totalPosts, postsPerSecond, filter, languages}` — volume metrics and language breakdown
- `topWords`: `[[word, count], ...]` — top 50 words (count >= 3)
- `topPhrases`: `[[bigram, count], ...]` — top 30 bigrams (count >= 2)
- `topTrigrams`: `[[trigram, count], ...]` — top 20 trigrams (count >= 2)
- `entities`: `[[entity, count], ...]` — top 25 handles/hashtags (count >= 2)
- `samplePosts`: `[{text, altTexts, hasImages}, ...]` — first 50 matching posts
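A result in this shape is easy to post-process. A minimal sketch, where `sample` is made-up illustrative data (not real firehose output):

```python
def summarize_sample(data):
    # Summarize a sample_firehose() result: volume plus the hottest terms.
    stats = data["stats"]
    top = [word for word, count in data["topWords"][:5]]
    return f"{stats['postsPerSecond']:.1f} posts/s; top words: {', '.join(top)}"

# Made-up sample data matching the documented shape
sample = {
    "stats": {"totalReceived": 4200, "totalPosts": 900,
              "postsPerSecond": 30.0, "filter": None, "languages": {"en": 700}},
    "topWords": [["python", 41], ["release", 17], ["async", 12]],
}
print(summarize_sample(sample))  # → 30.0 posts/s; top words: python, release, async
```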
### Read Feeds and Lists

get_feed_posts() accepts:

- List URLs: `https://bsky.app/profile/austegard.com/lists/3lankcdrlip2f`
- Feed URLs: `https://bsky.app/profile/did:plc:xxx/feed/feedname`
- AT-URIs: `at://did:plc:xxx/app.bsky.graph.list/xyz`

The function extracts the AT-URI from URLs automatically.
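That conversion can be sketched roughly as follows. This is an illustrative re-implementation assuming the URL's actor segment carries over directly into the AT-URI; the real skill may additionally resolve handles to DIDs:

```python
import re

def to_at_uri(url):
    # Sketch of the URL -> AT-URI conversion described above.
    m = re.match(r"https://bsky\.app/profile/([^/]+)/(lists|feed)/([^/?#]+)", url)
    if not m:
        return url  # assume it is already an AT-URI
    actor, kind, rkey = m.groups()
    # List URLs map to list records, feed URLs to feed generator records
    collection = "app.bsky.graph.list" if kind == "lists" else "app.bsky.feed.generator"
    return f"at://{actor}/{collection}/{rkey}"

print(to_at_uri("https://bsky.app/profile/austegard.com/lists/3lankcdrlip2f"))
# → at://austegard.com/app.bsky.graph.list/3lankcdrlip2f
```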
### Explore a Thread

Fetch full thread context for a post, with parents and replies:

```python
thread = get_thread("https://bsky.app/profile/user/post/xyz", depth=10)
# Returns: {post: {...}, parent: {...}, replies: [...]}
```
### Find Quote Posts

Discover posts that quote a specific post:

```python
quotes = get_quotes("https://bsky.app/profile/user/post/xyz")
for q in quotes:
    print(f"@{q['author_handle']}: {q['text'][:80]}")
```
### Analyze Engagement

Get the users who engaged with a post:

```python
likes = get_likes(post_url)
reposts = get_reposts(post_url)

# Both accept URLs and AT-URIs
likes = get_likes("at://did:plc:.../app.bsky.feed.post/...")
```
### Explore Social Graph

Navigate follower/following relationships:

```python
followers = get_followers("handle.bsky.social")
following = get_following("handle.bsky.social")
# Returns a list of actor dicts with handle, display_name, did, description, etc.
```
### Find Users

Search for users by name, handle, or bio:

```python
users = search_users("machine learning researcher")
for u in users:
    print(f"{u['display_name']} (@{u['handle']}): {u['description'][:100]}")
```
## API Endpoint Notes

- Public AppView: `https://api.bsky.app/xrpc/` for unauthenticated reads
- PDS: `https://bsky.social/xrpc/` for authenticated requests
- Trending: `app.bsky.unspecced.getTrends` (rich) and `app.bsky.unspecced.getTrendingTopics` (lightweight)
- Firehose: `wss://jetstream1.us-east.bsky.network/subscribe`
- Endpoint routing is automatic: authenticated requests go to the PDS, public requests go to the AppView
- Rate limits exist but are generous for read operations
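The routing rule above reduces to a one-line decision. A sketch (hypothetical helper, not part of the skill's API):

```python
def base_url(authenticated):
    # Authenticated requests go to the PDS, public requests to the AppView.
    return "https://bsky.social/xrpc/" if authenticated else "https://api.bsky.app/xrpc/"
```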
## Return Format

All API functions return structured dicts with:

- `uri`: AT protocol identifier
- `text`: Post content
- `created_at`: ISO timestamp
- `author_handle`: User handle
- `author_name`: Display name
- `likes`, `reposts`, `replies`: Engagement counts
- `links`: Full URLs extracted from post facets (post text truncates URLs with "...")
- `image_alts`: Alt text from embedded images
- `url`: Direct link to the post on bsky.app

The profile function returns: `handle`, `display_name`, `description`, `followers`, `following`, `posts`, `did`.
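Because every function returns this shape, results compose with ordinary dict processing. For example, ranking by engagement (the `posts` list is made-up sample data in the documented shape):

```python
def rank_by_engagement(posts, top=3):
    # Sort post dicts (shape documented above) by combined engagement counts.
    key = lambda p: p["likes"] + p["reposts"] + p["replies"]
    return sorted(posts, key=key, reverse=True)[:top]

posts = [  # made-up sample data
    {"text": "a", "likes": 5, "reposts": 1, "replies": 0},
    {"text": "b", "likes": 50, "reposts": 9, "replies": 4},
    {"text": "c", "likes": 12, "reposts": 2, "replies": 7},
]
print([p["text"] for p in rank_by_engagement(posts, top=2)])  # → ['b', 'c']
```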
## Account Analysis

Analyze accounts for categorization by topic. Fetches profile and posts, extracts keywords, and returns structured data for Claude to categorize.

### Analyze a User's Network

```python
# Analyze accounts you follow
results = analyze_accounts(following="yourhandle.bsky.social", limit=50)

# Analyze your followers
results = analyze_accounts(followers="yourhandle.bsky.social", limit=50)

# Analyze specific handles
results = analyze_accounts(handles=["user1.bsky.social", "user2.bsky.social"])
```

### Single Account Analysis

```python
analysis = analyze_account("user.bsky.social")
# Returns: {handle, display_name, description, keywords, post_count, followers, following}
```
### Keyword Extraction Options

The stopwords parameter filters domain-specific noise:

- `"en"`: English (general purpose, default)
- `"ai"`: AI/ML domain (filters tech boilerplate)
- `"ls"`: Life Sciences (filters research methodology)

```python
results = analyze_accounts(following="handle", stopwords="ai")
```

Requires: the extracting-keywords skill with a YAKE venv for keyword extraction.
### Filtering Accounts

```python
results = analyze_accounts(
    following="handle",
    exclude_patterns=["bot", "spam", "promo"]  # Skip accounts matching these
)
```

### Paginated Following/Followers

For large account lists beyond the 100-item limit of get_following/get_followers:

```python
all_following = get_all_following("handle", limit=500)  # Handles pagination
all_followers = get_all_followers("handle", limit=500)
```
### Account Analysis Output

Each analyzed account returns:

```python
{
    "handle": "user.bsky.social",
    "display_name": "User Name",
    "description": "Bio text here",
    "keywords": ["keyword1", "keyword2", "keyword3"],
    "post_count": 20,
    "followers": 1234,
    "following": 567
}
```

Claude uses bio + keywords to categorize accounts by topic without hardcoded rules.
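For intuition only, that categorization step could be approximated mechanically as keyword overlap against topic seed lists. The topics and account below are made-up illustrative data; in practice Claude categorizes from bio + keywords directly, without seed lists:

```python
def categorize(account, topics):
    # Assign the topic whose seed terms overlap most with the account's
    # keywords and bio; fall back to "other" when nothing matches.
    text = set(account["keywords"]) | set(account["description"].lower().split())
    best, best_score = "other", 0
    for topic, seeds in topics.items():
        score = len(text & set(seeds))
        if score > best_score:
            best, best_score = topic, score
    return best

topics = {"ml": ["transformers", "llm", "training"],
          "bio": ["genomics", "proteins", "sequencing"]}
account = {"handle": "user.bsky.social", "description": "I train llm systems",
           "keywords": ["transformers", "training"]}
print(categorize(account, topics))  # → ml
```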