# AI Tech Summary

## Core Goal
- Pull the right records and fields for a requested time range.
- Package evidence into a compact JSON context for RAG.
- Let the agent synthesize final summary text from retrieved evidence.
- Support daily, weekly, monthly, and custom time windows.
## Triggering Conditions
- Receive requests for daily, weekly, or monthly digests.
- Receive requests for arbitrary date-range summaries.
- Need evidence-grounded summary output from RSS entries/fulltext.
- Need agent-generated summary style rather than rigid scripted report format.
## Input Requirements

- Required tables in SQLite: `feeds`, `entries` (from `ai-tech-rss-fetch`).
- Optional table: `entry_content` (from `ai-tech-fulltext-fetch`).
- The shared DB path should be the same across all RSS skills.
- In multi-agent runtimes, set `AI_RSS_DB_PATH` to one absolute DB path for this agent.
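Before retrieval, a quick preflight check can confirm the database is usable. The helper below is a sketch, not part of this skill's scripts; it assumes only the table names listed above (`feeds` and `entries` required, `entry_content` optional):

```python
import sqlite3

REQUIRED = {"feeds", "entries"}   # produced by ai-tech-rss-fetch
OPTIONAL = {"entry_content"}      # produced by ai-tech-fulltext-fetch; may be absent

def check_rss_db(db_path: str) -> dict:
    """Report whether the required/optional tables exist in the SQLite DB."""
    con = sqlite3.connect(db_path)
    try:
        rows = con.execute(
            "SELECT name FROM sqlite_master WHERE type = 'table'"
        ).fetchall()
    finally:
        con.close()
    tables = {name for (name,) in rows}
    missing = REQUIRED - tables
    if missing:
        raise RuntimeError(
            f"missing tables {sorted(missing)}; run ai-tech-rss-fetch first"
        )
    # Absence of entry_content means the skill runs in metadata-only mode.
    return {"has_fulltext": OPTIONAL <= tables}
```

If `entry_content` is missing, the skill still works, just without full-text excerpts.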
## RAG Workflow

- Retrieve evidence context by time window:

```bash
export AI_RSS_DB_PATH="/absolute/path/to/workspace-rss-bot/ai_rss.db"
python3 scripts/time_report.py \
  --db "$AI_RSS_DB_PATH" \
  --period weekly \
  --date 2026-02-10 \
  --max-records 120 \
  --max-per-feed 20 \
  --summary-chars 8192 \
  --fulltext-chars 8192 \
  --pretty \
  --output /tmp/ai-tech-weekly-context.json
```
- Load the retrieval output and generate the final summary in the agent response.
- Read `query`, `dataset`, `aggregates`, and `records`.
- Prioritize `records` as the evidence source.
- Mention key trends, major events, and notable changes grounded in records.
- Include evidence anchors in the summary.
- Reference `entry_id`, feed, and URL for key claims.
- If retrieval is truncated, state that the summary is based on sampled top records.
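The loading-and-anchoring steps above can be sketched as follows. This is illustrative, not part of the skill; it assumes the four top-level keys named above and the default per-record fields (`entry_id`, `feed_title`, `url`):

```python
import json

REQUIRED_KEYS = ("query", "dataset", "aggregates", "records")

def load_records(path: str) -> list:
    """Load the retrieval context JSON and return its records list."""
    with open(path, encoding="utf-8") as fh:
        ctx = json.load(fh)
    missing = [k for k in REQUIRED_KEYS if k not in ctx]
    if missing:
        raise KeyError(f"context missing keys: {missing}")
    return ctx["records"]

def evidence_anchor(record: dict) -> str:
    """Format one record as an evidence anchor (entry id, feed, URL)."""
    return f"[{record['entry_id']}] {record['feed_title']}: {record['url']}"
```

Anchors built this way can be appended to the evidence list of the final response.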
## Time Window Modes

- `--period daily --date YYYY-MM-DD`
- `--period weekly --date YYYY-MM-DD`
- `--period monthly --date YYYY-MM-DD`
- `--period custom --start ... --end ...`
- Time filtering is always based on `entries.first_seen_at` (UTC).
- Custom boundaries support both YYYY-MM-DD and ISO datetime.
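One plausible reading of the period modes is a half-open `[start, end)` UTC window containing the given date. The sketch below is an assumption for illustration; the authoritative rules live in `references/time-window-rules.md`:

```python
from datetime import date, datetime, timedelta, timezone

def window_bounds(period: str, d: date) -> tuple:
    """Return assumed (start, end) UTC bounds for the window containing d."""
    if period == "daily":
        start = datetime(d.year, d.month, d.day, tzinfo=timezone.utc)
        return start, start + timedelta(days=1)
    if period == "weekly":
        monday = d - timedelta(days=d.weekday())  # assume ISO week, Monday start
        start = datetime(monday.year, monday.month, monday.day, tzinfo=timezone.utc)
        return start, start + timedelta(days=7)
    if period == "monthly":
        start = datetime(d.year, d.month, 1, tzinfo=timezone.utc)
        # First day of the next month (rolls over the year in December).
        end = datetime(d.year + (d.month == 12), d.month % 12 + 1, 1,
                       tzinfo=timezone.utc)
        return start, end
    raise ValueError(f"unsupported period: {period}")
```

For example, `--period weekly --date 2026-02-10` would cover Monday 2026-02-09 through Sunday 2026-02-15 under this interpretation.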
## Field Selection for RAG

- Use `--fields` to control token budget and relevance.
- Default fields are tuned for summarization:
  `entry_id,timestamp_utc,timestamp_source,feed_title,feed_url,title,url,summary,fulltext_status,fulltext_length,fulltext_excerpt`
- Common minimal field set for tight context:
  `entry_id,timestamp_utc,feed_title,title,url,summary`
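When records arrive with the full default field set, they can also be trimmed client-side to the minimal set above. This hypothetical helper mirrors what `--fields` does at retrieval time:

```python
# Minimal field set from the section above; unknown fields are simply skipped.
MINIMAL_FIELDS = ("entry_id", "timestamp_utc", "feed_title", "title", "url", "summary")

def trim_record(record: dict, fields: tuple = MINIMAL_FIELDS) -> dict:
    """Drop everything except the requested fields to shrink the token budget."""
    return {k: record[k] for k in fields if k in record}
```

Passing the minimal set via `--fields` is still preferable, since it avoids transferring unused text at all.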
## Recommended Agent Output Pattern

- Use this order in the final response:
  1. Time range scope
  2. Top themes/trends
  3. Key developments (grouped)
  4. Risks/open questions
  5. Evidence list (entry IDs + URLs)
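The five sections above can be assembled mechanically once the agent has produced their content. A minimal sketch (the function and its section labels are illustrative, not part of the skill):

```python
def render_summary(scope, themes, developments, risks, evidence):
    """Join the recommended sections, in order, into one markdown response."""
    parts = [
        "## Time range\n" + scope,
        "## Top themes/trends\n" + "\n".join(f"- {t}" for t in themes),
        "## Key developments\n" + "\n".join(f"- {d}" for d in developments),
        "## Risks/open questions\n" + "\n".join(f"- {r}" for r in risks),
        "## Evidence\n" + "\n".join(f"- {e}" for e in evidence),
    ]
    return "\n\n".join(parts)
```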
## Configurable Parameters

- `--db` / `AI_RSS_DB_PATH` (an absolute path is recommended in multi-agent runtimes)
- `--period`, `--date`, `--start`, `--end`
- `--max-records`, `--max-per-feed`
- `--summary-chars`, `--fulltext-chars`
- `--top-feeds`, `--top-keywords`
- `--fields`, `--output`, `--pretty`, `--fail-on-empty`
## Error Handling

- Missing `feeds`/`entries`: fail fast with setup guidance.
- Invalid date/time/field list: return parse errors.
- Missing `entry_content`: continue in metadata-only mode.
- Empty retrieval set: return an empty context; optionally fail with `--fail-on-empty`.
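The empty-set rule can also be enforced client-side after loading the context. This guard is a sketch of the same behavior, with `fail_on_empty` as the analogue of `--fail-on-empty`:

```python
def guard_records(ctx: dict, fail_on_empty: bool = False) -> list:
    """Return the context's records; an empty set is allowed unless
    fail_on_empty is set, mirroring the script's --fail-on-empty flag."""
    records = ctx.get("records") or []
    if not records and fail_on_empty:
        raise RuntimeError("retrieval window matched no entries")
    return records
```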
## References

- `references/time-window-rules.md`
- `references/report-format.md`
## Assets

- `assets/config.example.json`
## Scripts

- `scripts/time_report.py`