Army adopts AI to pre-screen promotion candidates: Practical takeaways for HR
The Army is introducing artificial intelligence to help pre-screen promotion candidates, starting with non-commissioned officers and potentially expanding to commissioned officers. The goal is clear: reduce the volume of records boards must review so human judgment is focused where it matters most. Leaders emphasized that humans can override AI decisions and that bias safeguards are part of the design.
For HR teams, this is a relevant blueprint: use AI to triage, keep people in the loop for judgment calls, and build governance to prevent bias.
What the Army said
At a recent conference, Col. Tom Malejko, chief talent analytics officer, posed the essential question: "Can we screen out individuals that are not really competitive for the process up front and then help our board members to focus their valuable time and resources on those individuals that are then most competitive for that selection?" He also noted that service members will be able to override AI-driven decisions.
Army leaders said boards face thousands of records per cycle and confirmed they've used algorithms to screen candidates for four years. Reports also indicate AI is being explored in recruitment.
Bias controls and human oversight
- Elimination criteria will not be based on race, ethnicity, rank, or branch.
- Humans retain final say with explicit override capability.
- Screening focuses on discrete, job-relevant criteria to narrow the pool before deeper review.
Translate this to your HR process
- Define "uncompetitive" with transparent, job-related thresholds (skills, performance bands, time-in-role, certifications).
- Use AI for first-pass screening; reserve human review for borderline cases, exceptions, and final decisions.
- Implement an override and appeal path. Document every override to improve the model and policy.
- Run bias and adverse impact testing on every model update and every hiring/promotion cycle; a minimal version of one such check is sketched after this list.
- Audit data quality (missing fields, inconsistent ratings) before you automate decisions.
- Track KPIs: time-to-decision, quality of hire/promotion, adverse impact ratio, false positives/negatives.
- Pilot with a single role or level, compare against a control group, and only scale if results hold.
- Create governance: model owners, reviewers, retraining cadence, documentation, and incident response.
- Align with legal guidance and ethics standards; keep documentation ready for internal and external audits.
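As referenced above, one concrete way to start adverse impact testing is a simple selection-rate comparison against the highest-rate group (the common four-fifths rule of thumb). The sketch below is a minimal, illustrative Python example; the group labels, sample data, and 0.80 threshold are assumptions for demonstration, not legal guidance or the Army's method.

```python
# Minimal sketch of an adverse-impact check on screening outcomes.
# Input is a list of (group, passed_screen) records; the group labels and
# the 0.80 "four-fifths" threshold below are illustrative assumptions.
from collections import defaultdict

def selection_rates(records):
    """Return the pass rate per group from (group, passed) tuples."""
    passed = defaultdict(int)
    total = defaultdict(int)
    for group, was_selected in records:
        total[group] += 1
        passed[group] += 1 if was_selected else 0
    return {g: passed[g] / total[g] for g in total}

def adverse_impact_ratios(records, threshold=0.80):
    """Compare each group's selection rate to the highest-rate group."""
    rates = selection_rates(records)
    benchmark = max(rates.values())
    return {
        g: {"rate": round(r, 3),
            "ratio": round(r / benchmark, 3),
            "flag": (r / benchmark) < threshold}
        for g, r in rates.items()
    }

if __name__ == "__main__":
    # Hypothetical cycle: group A passes 40 of 100, group B passes 25 of 100.
    sample = [("A", True)] * 40 + [("A", False)] * 60 \
           + [("B", True)] * 25 + [("B", False)] * 75
    for group, result in adverse_impact_ratios(sample).items():
        print(group, result)
```

In practice you would run a check like this for every model version and every cycle, and route any flagged group to your governance and legal review before the screen is used on live decisions.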
Implementation checklist for HR leaders
- Write down your selection criteria and weightings; make them auditable.
- Choose explainable models where feasible; require reason codes for every screen-out (see the screening sketch after this checklist).
- Calibrate with your promotion board or hiring panel to align model thresholds with human standards.
- Set aside holdout data and backtest periodically to catch performance drift early.
- Communicate with candidates: what's automated, what's human-reviewed, and how to appeal.
- Train HRBPs and managers on responsible AI use, bias testing, and documentation.
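To make "reason codes for every screen-out" concrete, here is a minimal, hypothetical sketch of a transparent first-pass screen. The criteria names (time in role, last rating, certification) and thresholds are illustrative assumptions, not the Army's criteria or a recommended standard; the point is that every screened-out candidate carries an auditable explanation and nothing is rejected silently.

```python
# Minimal sketch of a rule-based first-pass screen that records a reason
# code for every screen-out. Criteria and thresholds are hypothetical.
from dataclasses import dataclass, field

@dataclass
class Candidate:
    name: str
    months_in_role: int
    last_rating: int          # e.g. 1 (low) to 5 (high)
    required_cert: bool
    reason_codes: list = field(default_factory=list)

# Documented, job-related rules; each returns True when the criterion is met.
RULES = [
    ("TIME_IN_ROLE",  lambda c: c.months_in_role >= 12),
    ("PERFORMANCE",   lambda c: c.last_rating >= 3),
    ("CERTIFICATION", lambda c: c.required_cert),
]

def screen(candidates):
    """Split candidates into (advance, screened_out) with reason codes attached."""
    advance, screened_out = [], []
    for c in candidates:
        c.reason_codes = [code for code, rule in RULES if not rule(c)]
        (advance if not c.reason_codes else screened_out).append(c)
    return advance, screened_out

if __name__ == "__main__":
    pool = [
        Candidate("A. Smith", 18, 4, True),
        Candidate("B. Jones", 6, 4, True),
        Candidate("C. Lee", 24, 2, False),
    ]
    forward, held = screen(pool)
    print("Advance to board:", [c.name for c in forward])
    for c in held:
        print("Held for human review:", c.name, c.reason_codes)
```

Because the rules are explicit data rather than buried model weights, they can be documented and audited, calibrated with your board or panel, and surfaced directly in candidate communications and appeal decisions.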
Why this matters for HR
High-volume selection overwhelms reviewers and pulls attention away from truly competitive candidates. The Army's approach shows a practical path: let software handle volume, keep humans accountable for judgment, and build fairness into the process from day one.
Equip your team
- Upskill your HR team on AI in selection, bias testing, and governance with focused programs: AI courses by job.
- Stay current on tools and methods: Latest AI courses.
Bottom line: use AI to narrow the pool, keep humans responsible for the decision, prove fairness with data, and iterate with clear governance.