AI wrote the cover letter. AI ran the interview. Both underperform. Here's how HR fixes it.
More employers are using AI to source, screen, and interview. More candidates are using AI to apply. Yet hiring outcomes are getting worse: lower signal, higher noise, and frustrated people on both sides.
Researchers reviewing tens of thousands of applications found that after large language models went mainstream, cover letters got longer and cleaner, but less useful. Employers started ignoring them, hiring rates slipped, and starting wages fell. That's what happens when everyone optimizes the same surface area: differentiation dies.
The signal problem: polished sameness
AI makes average candidates look above average on paper. It also makes strong candidates look average. When everything reads the same, you can't tell who's real.
If you still treat cover letters as a key signal, you're rewarding formatting, not capability. Shift weight to proof of work: job-relevant tasks, work samples, and structured screens that actually predict performance.
The interview problem: automation without judgment
As application volume spikes, teams lean on AI-led interviews. That saves time but doesn't remove bias by default. Algorithms can copy and even magnify the blind spots in the data they're trained on. Accessibility can suffer too, especially with asynchronous video tools.
Use AI to summarize, schedule, and transcribe. Keep decisions human. Standardize questions, score against evidence, and require human review before reject decisions. Offer accommodations and alternative formats for anyone who needs them.
Break the negative cycle
Right now, candidates automate to "beat the bot." Employers tighten filters to cope with volume. Both sides lose. HR can reset the game with clearer signals and simpler processes.
- Kill weak signals: Cap or remove cover letter scoring. Prioritize skills-based screens and work samples tied to the job.
- Simplify the application: Shorten forms. Remove duplicate fields. Fewer clicks = better data.
- Structure everything: Consistent question banks, anchored scorecards, and pass/fail criteria defined upfront.
- Human-in-the-loop: No automated rejections without human spot checks. Require escalation paths for edge cases.
- Accessibility by default: Offer text-based alternatives to video. Provide captions, extended time, and clear contact for accommodations.
- Bias checks that matter: Monitor pass-through rates, adverse impact ratio, and false positives/negatives at each stage.
- Data hygiene: Minimize collection. Set retention limits. Avoid features like facial expression or voice "analysis."
- Vendor diligence: Ask for validation studies, bias testing methods, model update cadence, and right-to-audit clauses.
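The adverse impact check above follows the common four-fifths rule: compare each group's selection rate at a stage to the highest group's rate, and flag anything below 80%. A minimal sketch, using hypothetical counts and group names (the function names and data here are illustrative, not from any vendor tool):

```python
# Minimal per-stage bias check using the four-fifths rule.
# A stage is flagged when any group's selection rate falls below
# 80% of the most-selected group's rate.

def selection_rates(stage_counts):
    """stage_counts: {group: (passed, applied)} -> {group: rate}"""
    return {g: passed / applied for g, (passed, applied) in stage_counts.items()}

def adverse_impact(stage_counts, threshold=0.8):
    rates = selection_rates(stage_counts)
    top = max(rates.values())
    # Ratio of each group's rate to the most-selected group's rate.
    ratios = {g: r / top for g, r in rates.items()}
    flagged = [g for g, ratio in ratios.items() if ratio < threshold]
    return ratios, flagged

# Hypothetical resume-screen counts: (passed, applied) per group.
screen = {"group_a": (120, 400), "group_b": (45, 250)}
ratios, flagged = adverse_impact(screen)
print(ratios)   # group_a: 1.0; group_b: 0.18 / 0.30 = 0.6
print(flagged)  # ['group_b'] -> below the 0.8 threshold, needs review
```

Running the same check at every stage (screen, assessment, interview, offer) shows where candidates drop out disproportionately, which is far more actionable than a single end-of-funnel number.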
Compliance isn't optional
Several states are moving on AI-in-hiring standards. Federal anti-discrimination laws still apply, even if a vendor's tool made the call. Lawsuits are emerging around accessibility and fairness in automated interviews.
- EEOC resources on AI in employment can guide disclosures, impact testing, and accommodations.
- Adopt an internal control framework such as the NIST AI Risk Management Framework to formalize risk reviews and documentation.
A practical 30-day plan
- Week 1: Inventory every AI-assisted step from sourcing to offer. Map decisions, data sources, and who owns them. Freeze high-risk features (e.g., facial analysis) pending review.
- Week 2: Write candidate notices, add an AI-use FAQ, and publish an accommodations process. Set data retention and deletion timelines.
- Week 3: Validate your skills tests. Run bias and accuracy checks on screening thresholds. Add human review gates for auto-rejects.
- Week 4: Train recruiters and hiring managers on structured interviewing and scorecards. Pilot a skills-first funnel for one role and compare outcomes.
What to stop, start, and keep
- Stop: Overweighting cover letters. Blind trust in vendor claims. Auto-rejections without human checks.
- Start: Skills-based screens, clearer job criteria, transparent candidate communication, and regular bias audits.
- Keep: Human judgment where it matters most: final screens, calibrations, and exceptions.
If your team needs upskilling
If you want your recruiters and hiring managers to get fluent with AI tools without losing judgment, explore practical courses by role and skill here: AI courses by job.
Bottom line
AI is useful as an assistant, not a gatekeeper. Reduce noise, strengthen real signals, and keep a human accountable for the decision. That's how HR gets faster hiring without losing fairness, or great talent.