# Crossref Lookup

Use this skill for citation metadata work backed by the Crossref REST API.
## Instructions

- Prefer this skill when the user needs DOI validation, title search, citation metadata, or bibliography auditing.
- Use the bundled CLI:
  - In this repository: `skills/crossref-lookup/scripts/lookup`
  - After installation: `~/.agents/skills/crossref-lookup/scripts/lookup`
- Choose the narrowest mode that matches the request:
  - `--doi` for validating or enriching one DOI
  - `--title` for title-to-DOI discovery
  - `--validate-file` for one DOI per line
  - `--audit-bibliography` for a bibliography file such as `.bib` or plain text
- Normalize DOI strings before interpreting failures. Acceptable raw forms include `10.xxxx/...`, `doi:10.xxxx/...`, and `https://doi.org/10.xxxx/...`.
- If the user has a contact email for polite-pool requests, pass it with `--email`.
- Treat Crossref as citation metadata, not full text. If exact abstract-page wording, final pagination, or publisher formatting matters, verify the shortlisted record on the publisher or DOI landing page.
- When title search returns multiple plausible records, keep the ambiguity explicit instead of selecting a match silently.
## Quick Reference

| Task | Action |
|---|---|
| Validate DOI | `skills/crossref-lookup/scripts/lookup --doi 10.1038/nature12373` |
| Search by title | `skills/crossref-lookup/scripts/lookup --title "CRISPR-Cas9 genome editing"` |
| Validate a DOI list | `skills/crossref-lookup/scripts/lookup --validate-file dois.txt` |
| Audit bibliography | `skills/crossref-lookup/scripts/lookup --audit-bibliography refs.bib` |
| Citation style | `--style apa` |
| Write to file | `--output crossref-report.txt` |
| Polite-pool email | `--email you@example.org` |
## Input Requirements

- Python 3 with network access
- One of:
  - a DOI via `--doi`
  - a title via `--title`
  - a file path for `--validate-file`
  - a bibliography file path for `--audit-bibliography`
- Optional:
  - `--style` for formatted citation output
  - `--output` for saving the report
  - `--email` for the Crossref user agent
## Output

- DOI validation status and normalized DOI when `--doi` is used
- title, journal, year, and formatted citation when metadata is found
- ranked title-search candidates for `--title`
- summary counts plus invalid/error entries for file validation and bibliography audits
- optional output file if `--output` is set
## Quality Gates
- The lookup mode matches the user request
- DOI inputs are normalized before treating them as invalid
- Ambiguous title matches are presented as candidates rather than a silent single answer
- Citation formatting uses the requested style when style matters
- The final answer distinguishes Crossref metadata from publisher full text
## Examples

### Example 1: Validate a DOI

```sh
skills/crossref-lookup/scripts/lookup --doi "10.1038/nature12373"
```

### Example 2: Search by title

```sh
skills/crossref-lookup/scripts/lookup \
  --title "CRISPR-Cas9 genome editing" \
  --email you@example.org
```

### Example 3: Audit a bibliography

```sh
skills/crossref-lookup/scripts/lookup \
  --audit-bibliography refs.bib \
  --output crossref-audit.txt
```
## Troubleshooting

**Issue:** The DOI looks valid but Crossref says it is missing.
**Solution:** Normalize the DOI first and retry. If it still fails, report that Crossref did not return a record instead of assuming publisher error.

**Issue:** Title search returns multiple plausible matches.
**Solution:** Return the shortlist with DOI, journal, and year so the user can disambiguate.

**Issue:** Bibliography audit reports missing DOIs for many entries.
**Solution:** Treat that as a coverage gap, not proof that the citations are invalid. Crossref metadata may be incomplete for some records.
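For the bibliography-audit case, the first step is pulling candidate DOIs out of the `.bib` or plain-text file. A rough regex-based sketch of that extraction (the bundled CLI's actual parsing may differ, and `extract_dois` is a hypothetical name):

```python
import re

# Match DOI-shaped tokens: "10." + 4-9 digit registrant code + "/" + suffix.
# Stop at whitespace and common BibTeX/quote delimiters.
DOI_PATTERN = re.compile(r"10\.\d{4,9}/[^\s\"'}>,]+", re.IGNORECASE)

def extract_dois(text: str) -> list[str]:
    """Hypothetical helper: list DOI-shaped strings found in bibliography text."""
    return [m.group(0).rstrip(".") for m in DOI_PATTERN.finditer(text)]

bib = '@article{key, doi = {10.1038/nature12373}, title = {...}}'
print(extract_dois(bib))  # ['10.1038/nature12373']
```

Entries that yield no match here are exactly the "missing DOI" cases above: a gap in the entry's metadata, not evidence that the citation itself is wrong.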