Sustainability Summary
Core Goal
- Read only relevant (`is_relevant=1`) records from the RSS DB.
- Build a compact RAG context from DOI-keyed entries.
- Include optional enriched content from the separate fulltext DB table `entry_content` when available.
- Let the agent synthesize the final summary text with evidence anchors.
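The goals above boil down to one DOI-keyed join across the two database files. A minimal sketch, assuming an `entries.is_relevant` flag and an `entry_content.content` column (the companion skills' actual column names may differ):

```python
import os
import sqlite3

def load_relevant_records(rss_db, fulltext_db=None):
    """Read only is_relevant=1 entries, left-joined with enriched full text."""
    con = sqlite3.connect(rss_db)
    con.row_factory = sqlite3.Row
    if fulltext_db and os.path.exists(fulltext_db):
        # Fulltext lives in a separate DB file, so attach it alongside.
        con.execute("ATTACH DATABASE ? AS ft", (fulltext_db,))
        sql = """SELECT e.doi, e.title, e.url, e.summary,
                        c.content AS fulltext_excerpt
                 FROM entries AS e
                 LEFT JOIN ft.entry_content AS c ON c.doi = e.doi
                 WHERE e.is_relevant = 1"""
    else:
        # Metadata-only mode when no fulltext DB is available.
        sql = """SELECT doi, title, url, summary, NULL AS fulltext_excerpt
                 FROM entries WHERE is_relevant = 1"""
    rows = [dict(r) for r in con.execute(sql)]
    con.close()
    return rows
```

The LEFT JOIN keeps relevant entries even when no fulltext row exists, which matches the metadata-only fallback described under Error Handling.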
Triggering Conditions
- Receive requests for daily/weekly/monthly sustainability digests.
- Receive requests for custom date-range summaries.
- Need evidence-grounded output from labeled RSS entries and enriched content.
Input Requirements
- Required RSS DB tables: `feeds`, `entries` (from sustainability-rss-fetch); `entries` must be DOI-keyed and relevance-labeled.
- Optional fulltext DB table: `entry_content` (from sustainability-fulltext-fetch).
- The RSS DB and the fulltext DB must be different files.
Workflow
- Build retrieval context by time window.
export SUSTAIN_RSS_DB_PATH="/absolute/path/to/workspace-rss-bot/sustainability_rss.db"
export SUSTAIN_FULLTEXT_DB_PATH="/absolute/path/to/workspace-rss-bot/sustainability_fulltext.db"
python3 scripts/time_report.py \
--rss-db "$SUSTAIN_RSS_DB_PATH" \
--content-db "$SUSTAIN_FULLTEXT_DB_PATH" \
--period weekly \
--date 2026-02-10 \
--max-records 120 \
--max-per-feed 20 \
--summary-chars 8192 \
--fulltext-chars 8192 \
--pretty \
--output /tmp/sustainability-weekly-context.json
- Generate the final summary from the returned `records` + `aggregates`.
- Cite evidence using DOI + URL for key claims.
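The synthesis step consumes the context file written by `--output`. A minimal sketch of turning records into citation-ready evidence anchors, assuming a top-level `records` key whose per-record fields follow the Default Fields list below (the exact JSON layout is an assumption):

```python
import json

def build_evidence_lines(context_path, limit=5):
    """Turn a retrieval-context file into DOI + URL evidence anchors."""
    with open(context_path) as fh:
        ctx = json.load(fh)
    lines = []
    for rec in ctx.get("records", [])[:limit]:
        # One anchor per record: title plus DOI and URL, as the workflow requires.
        lines.append(f"- {rec['title']} (DOI: {rec['doi']}, {rec['url']})")
    return lines
```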
Time Window Modes
- `--period daily --date YYYY-MM-DD`
- `--period weekly --date YYYY-MM-DD`
- `--period monthly --date YYYY-MM-DD`
- `--period custom --start ... --end ...`
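The four modes above map onto a half-open [start, end) date window. A minimal sketch; Monday-anchored weeks and calendar months are assumptions here, so check references/time-window-rules.md for the authoritative rules:

```python
from datetime import date, timedelta

def resolve_window(period, anchor=None, start=None, end=None):
    """Map a period mode plus anchor date to a half-open [start, end) window."""
    if period == "custom":
        return start, end
    d = date.fromisoformat(anchor)
    if period == "daily":
        return d, d + timedelta(days=1)
    if period == "weekly":
        # Assumed Monday-anchored: snap back to the Monday of the anchor's week.
        monday = d - timedelta(days=d.weekday())
        return monday, monday + timedelta(days=7)
    if period == "monthly":
        first = d.replace(day=1)
        nxt = first.replace(year=first.year + 1, month=1) if first.month == 12 \
            else first.replace(month=first.month + 1)
        return first, nxt
    raise ValueError(f"unknown period: {period}")
```

With the `--date 2026-02-10` anchor from the workflow example, the weekly mode yields the window 2026-02-09 through 2026-02-16 under these assumptions.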
Default Fields
`doi`, `timestamp_utc`, `timestamp_source`, `feed_title`, `feed_url`, `title`, `url`, `summary`, `fulltext_status`, `fulltext_length`, `fulltext_excerpt`
Configurable Parameters
- `--rss-db` / `--content-db` (or env vars `SUSTAIN_RSS_DB_PATH` / `SUSTAIN_FULLTEXT_DB_PATH`)
- `--period`, `--date`, `--start`, `--end`
- `--max-records`, `--max-per-feed`
- `--summary-chars`, `--fulltext-chars`
- `--top-feeds`, `--top-keywords`
- `--fields`, `--output`, `--pretty`, `--fail-on-empty`
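The flags above could be wired up with argparse roughly as follows. This is a sketch, not the script's actual parser; the defaults shown are taken from the workflow example and may not match assets/config.example.json:

```python
import argparse
import os

def build_parser():
    """Assumed CLI surface for scripts/time_report.py (defaults are guesses)."""
    p = argparse.ArgumentParser(prog="time_report")
    # DB paths fall back to the documented environment variables.
    p.add_argument("--rss-db", default=os.environ.get("SUSTAIN_RSS_DB_PATH"))
    p.add_argument("--content-db", default=os.environ.get("SUSTAIN_FULLTEXT_DB_PATH"))
    p.add_argument("--period", choices=["daily", "weekly", "monthly", "custom"])
    p.add_argument("--date")
    p.add_argument("--start")
    p.add_argument("--end")
    p.add_argument("--max-records", type=int, default=120)
    p.add_argument("--max-per-feed", type=int, default=20)
    p.add_argument("--summary-chars", type=int, default=8192)
    p.add_argument("--fulltext-chars", type=int, default=8192)
    p.add_argument("--top-feeds", type=int)
    p.add_argument("--top-keywords", type=int)
    p.add_argument("--fields")
    p.add_argument("--output")
    p.add_argument("--pretty", action="store_true")
    p.add_argument("--fail-on-empty", action="store_true")
    return p
```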
Error Handling
- Missing required DOI-based tables: fail fast with setup guidance.
- RSS DB and fulltext DB path collision: fail fast and require separate files.
- Invalid date/time/field list: return parse errors.
- Missing `entry_content`: continue in metadata-only mode.
- Empty relevant set: return an empty context; optionally fail with `--fail-on-empty`.
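The fail-fast conditions above can be checked before any retrieval work. A minimal sketch, assuming `feeds` and `entries` are the only required tables and that a plain `SystemExit` with guidance text is acceptable:

```python
import os
import sqlite3

def validate_setup(rss_db, fulltext_db=None):
    """Fail fast on path collision or missing required DOI-based tables."""
    # RSS DB and fulltext DB must be different files.
    if fulltext_db and os.path.abspath(rss_db) == os.path.abspath(fulltext_db):
        raise SystemExit("RSS DB and fulltext DB must be separate files")
    con = sqlite3.connect(rss_db)
    have = {r[0] for r in con.execute(
        "SELECT name FROM sqlite_master WHERE type='table'")}
    con.close()
    missing = sorted({"feeds", "entries"} - have)
    if missing:
        # Setup guidance: point back at the ingestion skill.
        raise SystemExit(
            f"missing required tables {missing}; run sustainability-rss-fetch first")
```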
References
- references/time-window-rules.md
- references/report-format.md
Assets
assets/config.example.json
Scripts
scripts/time_report.py