Job Seekers Demand Transparency and Fairness in Automated Recruitment Decisions
Job seekers want transparency in automated recruitment: clear information on how decisions are made and assurance of human oversight. Fairness and bias remain top concerns.

New research by the Information Commissioner’s Office (ICO) reveals that job candidates want transparency when automated decision-making (ADM) tools are used in recruitment. Candidates want to know exactly when and how these systems are applied, what data is processed, and how decisions are reached.
As AI increasingly handles tasks like screening CVs, conducting initial interviews, and assessing candidate fit, concerns about fairness, transparency, and accountability have come to the forefront. The ICO’s study, conducted by Revealing Reality, focused on public attitudes toward ADM in recruitment and how individuals experience these technologies.
Key Findings from the Research
The study involved remote focus groups with 33 job seekers representing diverse employment backgrounds. Many participants were aware that ADM is widely used but had limited understanding of its workings. They often viewed decision-making as either fully human or fully automated, without recognizing hybrid approaches.
- Transparency is critical: Participants emphasized the need to be informed when ADM tools are in use during recruitment.
- Human oversight remains essential: While ADM can improve efficiency, candidates expect humans to oversee decisions to ensure fairness and address bias.
- Bias concerns persist: Many worry ADM may reinforce existing societal biases. Fairness and non-discrimination are top expectations for these systems.
- Candidate experience affects perceptions: Poor communication, vague feedback, and impersonal recruitment tasks (like games) negatively shape attitudes toward ADM.
- ADM’s role should be context-dependent: Light use for initial filtering is acceptable, but fully automated or comprehensive assessments raise significant concerns.
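As a purely illustrative sketch of what “light-touch, human-overseen” filtering could look like in practice (the candidate fields, threshold, and function names below are hypothetical examples, not drawn from the ICO research): automation checks only objective minimums, the automated step is disclosed, and no candidate is rejected without human review.

```python
from dataclasses import dataclass

@dataclass
class Candidate:
    name: str
    years_experience: float
    has_required_certification: bool

# Hypothetical minimum requirement for the role (illustrative only).
MIN_YEARS = 2.0

def initial_filter(candidate: Candidate) -> dict:
    """Light-touch automated filtering: check objective minimums only.

    Clear passes advance automatically; everything else is routed to a
    human reviewer rather than being auto-rejected, and the decision
    record notes that an automated step was used.
    """
    meets_minimums = (
        candidate.years_experience >= MIN_YEARS
        and candidate.has_required_certification
    )
    return {
        "candidate": candidate.name,
        "recommendation": "advance" if meets_minimums else "human_review",
        "automated_step": "initial_filter",  # disclosed to the candidate
        "final_decision_by": "human",        # ADM never rejects outright here
    }
```

The key design choice, echoing the findings above, is that the automated step can only say “advance” or “escalate to a human” — it has no rejection path of its own, and its involvement is recorded so it can be communicated to candidates.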
Candidate Reactions to ADM Tools
Some participants shared frustrations about recruitment tests and games that seemed unrelated to job skills. For example, a graduate recalled a game that involved inflating a balloon without letting it burst, and questioned its relevance to the role.
Many believed they had encountered ADM-driven automated rejections but noted a lack of transparency from employers. Identical rejection emails and quick turnaround times were seen as signs of automation, yet no clear information was provided upfront. This absence of openness was a common complaint.
Implications for HR Professionals
For HR teams, these insights highlight the need to communicate clearly about ADM use in recruitment. Candidates want transparency about the role of automation and assurance that decisions are fair and overseen by humans.
Addressing bias in ADM systems is crucial, especially to protect marginalized groups who may be disproportionately impacted. Providing meaningful feedback and improving candidate communication can help build trust and improve the recruitment experience.
Many candidates accept ADM for early-stage filtering but expect human judgment for assessments and final decisions. Balancing efficiency with fairness and empathy should be a priority for recruitment strategy.
Looking Ahead
The ICO plans to consult on updates to its ADM and profiling guidance and develop a statutory code of practice on AI and ADM within the next year. This will shape standards and best practices for using these technologies in recruitment.
HR professionals interested in understanding AI tools and ethical recruitment practices can find relevant training and resources at Complete AI Training.