AI Slop Is Spurring Record Requests for Imaginary Journals
Never heard of the Journal of International Relief or the International Humanitarian Digital Repository? They don't exist. Yet popular AI assistants are pointing researchers to them, according to a warning from the International Committee of the Red Cross (ICRC), which runs widely used research archives.
ChatGPT, Gemini, Copilot, and similar tools are generating incorrect or fabricated archival references. The result: students, researchers, and archivists spend hours chasing ghosts.
Why this matters for research teams and libraries
The Library of Virginia estimates that 15 percent of emailed reference questions it receives are now AI-generated, including citations to both published works and unique primary sources that aren't real. "For our staff, it is much harder to prove that a unique record doesn't exist," says Sarah Falls, the library's chief of researcher engagement.
Beyond inconvenience, this erodes trust in reference workflows. It also shifts time from real research to verifying fabrications.
What's going wrong
Large language models predict text. When asked for citations, they often produce plausible titles, journal names, and collection identifiers that fit the pattern of a real reference but point to nothing. Confident tone + invented details = wasted time.
Verify first, then proceed: a fast checklist
- Start with authoritative catalogs. For humanitarian and conflict collections, search the ICRC Archives and library catalogs before anything else.
- Require identifiers. Look for DOI, ISSN, ISBN, call number, collection name, series, box/folder, or accession ID. A citation missing these core elements is a red flag.
- Validate journal and article claims. Use an authoritative registry such as Crossref to check journal existence and article metadata (a runnable sketch follows this list).
- Trace to a record. You should be able to land on a publisher page, repository record, catalog entry, or finding aid. Screenshots or quotes without a source don't count.
- Cross-check names. Confirm author identities via institutional pages or ORCID, and verify that their publication lists include the cited work.
- Be wary of "archive-sounding" labels. Fake sources often mimic official-sounding repositories. If you can't locate a home institution, catalog, or contact, treat it as suspect.
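To make the registry and author checks concrete, here is a minimal Python sketch. It assumes the public Crossref REST API (api.crossref.org, which serves `/works/{doi}` and `/journals/{issn}`) and the ORCID public API (pub.orcid.org, v3.0 schema); the contact address in the User-Agent and the sample identifiers are illustrative placeholders, and response shapes may change over time.

```python
# A minimal verification sketch using the public Crossref and ORCID APIs.
import requests

# Crossref's "polite pool" asks for a contact address; this one is a placeholder.
HEADERS = {"User-Agent": "citation-checker/0.1 (mailto:you@example.org)"}

def doi_exists(doi: str) -> bool:
    """Return True if Crossref resolves the DOI to a registered work."""
    r = requests.get(f"https://api.crossref.org/works/{doi}",
                     headers=HEADERS, timeout=10)
    return r.status_code == 200

def journal_exists(issn: str) -> bool:
    """Return True if Crossref knows a journal with this ISSN."""
    r = requests.get(f"https://api.crossref.org/journals/{issn}",
                     headers=HEADERS, timeout=10)
    return r.status_code == 200

def orcid_titles(orcid_id: str) -> list[str]:
    """Fetch work titles from an author's public ORCID record (v3.0 schema)."""
    r = requests.get(
        f"https://pub.orcid.org/v3.0/{orcid_id}/works",
        headers={**HEADERS, "Accept": "application/json"},
        timeout=10,
    )
    r.raise_for_status()
    titles = []
    for group in r.json().get("group", []):
        for summary in group.get("work-summary", []):
            title = summary.get("title", {}).get("title", {}).get("value")
            if title:
                titles.append(title)
    return titles

if __name__ == "__main__":
    print(doi_exists("10.1038/nature12373"))        # real DOI -> True
    print(journal_exists("0028-0836"))              # Nature's ISSN -> True
    print(doi_exists("10.9999/made-up-doi-12345"))  # fabricated -> False
```

A 404 from either Crossref endpoint doesn't prove a source is fake (not everything is registered with Crossref), but a citation that fails all three checks deserves skepticism before it reaches a librarian's inbox.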
How to prompt AI without creating more noise
- Ask for verifiable sources only: "Cite only items that have a DOI or a direct catalog/finding-aid URL. If none exist, say you cannot verify."
- Force uncertainty: "If you are not sure a source exists, state 'uncertain' instead of guessing."
- Request structured output with identifiers first: DOI/URL, then title, authors, year, journal, and publisher. Reject entries missing identifiers (a template follows this list).
- Never copy AI citations into emails to librarians until you've independently verified them.
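A single prompt can enforce all three rules at once. The wording below is an illustrative starting point, not a tested recipe; adapt it to your assistant.

```python
# An illustrative prompt template combining the rules above.
CITATION_PROMPT = """\
List sources on: {topic}

Rules:
1. Cite only items you can tie to a DOI or a direct catalog/finding-aid URL.
2. For each item, output fields in this order:
   DOI/URL, title, authors, year, journal, publisher.
3. If you are not sure a source exists, write "uncertain" instead of
   guessing. Do not invent identifiers.
"""

print(CITATION_PROMPT.format(topic="ICRC relief operations, 1945-1950"))
```

Putting identifiers first makes fabrications easier to spot: a model that cannot produce a DOI or URL has nothing to hide behind.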
A quick workflow you can adopt today
- Step 1: Quote-search the exact title. No hits from publishers, catalogs, or repositories = suspect.
- Step 2: Check the journal. Confirm the journal exists (scope, ISSN, publisher) and that it actually covers the topic and time frame claimed.
- Step 3: Look up the author. Verify affiliation and publication list; see if the work appears anywhere credible.
- Step 4: For archives, demand full citation elements: collection name, series, box/folder, call number, repository. If these are missing, pause.
- Step 5: Keep a log of verified vs. rejected references to prevent repeat chasing (a minimal sketch follows).
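Step 5's log can be as simple as an append-only CSV that the whole team shares. The file name and column set below are assumptions; any spreadsheet with the same fields would serve.

```python
# A minimal append-only log for Step 5 of the workflow.
import csv
from datetime import date
from pathlib import Path

LOG = Path("citation_log.csv")  # assumed file name
FIELDS = ["date", "citation", "identifier", "status", "where_checked"]

def log_reference(citation: str, identifier: str,
                  status: str, where_checked: str) -> None:
    """Append one verified/rejected entry so nobody chases it twice."""
    new_file = not LOG.exists()
    with LOG.open("a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        if new_file:
            writer.writeheader()
        writer.writerow({
            "date": date.today().isoformat(),
            "citation": citation,
            "identifier": identifier,
            "status": status,  # e.g. "verified" or "rejected"
            "where_checked": where_checked,
        })

log_reference(
    "Journal of International Relief, vol. 12 (1948)",
    "none",
    "rejected",
    "Crossref, ICRC catalog",
)
```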
Library policies are shifting
The ICRC advises researchers to use online catalogs and references found in existing published scholarship rather than assuming any AI-cited source is real, no matter how authoritative it sounds. The Library of Virginia plans to ask patrons to vet sources and disclose when AI was used, and to limit staff time spent verifying questionable items.
Research groups should mirror that approach: require verification before requests go to librarians and archivists, and add a disclosure line when AI assisted the search.
What to include in requests to archives and libraries
- Your research question in one or two sentences.
- Full citation details you have (identifiers first).
- Where you looked: catalogs, databases, publisher sites.
- Whether AI was involved, and which parts you have already verified.
- A clear ask (locate, confirm existence, suggest alternatives) and your deadline. An example request follows this list.
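Put together, a request might read like this. Every detail below is invented for illustration, using the fake journal from the opening of this piece:

```
Research question: I am tracing postwar relief shipments cited to the
"Journal of International Relief," vol. 12 (1948).

Citation details (identifiers first):
  DOI/URL: none found
  Title: Journal of International Relief, vol. 12 (1948); no author given

Where I looked: Crossref and the ICRC online catalog; no matches.

AI disclosure: the citation was suggested by a chatbot and I could not
verify any part of it.

Ask: please confirm whether this journal exists, or suggest verified
holdings on the topic. Deadline: [date]
```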
Bottom line
AI can speed up brainstorming, but it's a poor source of record. Treat every AI citation as unverified until it resolves to a DOI, catalog record, or finding aid. Protect your time, and your librarians' time, by verifying first.
If you need practical guidance on prompting for verifiable outputs and reducing hallucinations, see these resources: Prompt Engineering.