AI can streamline hiring. Employers remain accountable for fair, transparent, merit-based decisions.
AI can speed up sourcing, screening, and scheduling. But under Singapore's Workplace Fairness Act (WFA) and Tripartite Guidelines on Fair Employment Practices (TGFEP), accountability stays with employers.
Your job is simple: widen the talent pool, assess on merit, and keep trust high. Use tech to improve workflow, not to pass off responsibility.
What fair hiring means
Fair hiring focuses on job-related requirements and applies them consistently to every candidate. That means assessing capability, not personal characteristics or irrelevant signals.
In Singapore, the WFA and TGFEP set the baseline for lawful, merit-based decisions. Follow both the legal requirements and the guiding principles to keep decisions defensible and fair.
Put it into action
Start with a job analysis. Define the essential functions, must-have skills, and nice-to-haves.
Embed these into the job description, job ad, screening criteria, interview scorecards, and final selection. Use competency-based questions and keep records of key decisions so choices stay consistent with the role.
Where AI helps, and where it can hurt
Teams are using AI to draft job descriptions, source and screen candidates, run video interviews, schedule meetings, and standardise communications. Done well, this reduces variability in repetitive tasks and improves throughput.
But if left unchecked, models can repeat bias from historical data, hide how scores are produced, or filter out strong candidates with non-traditional profiles. Keep AI tied to genuine job requirements and monitor its outputs.
For added assurance, refer to the Model AI Governance Framework, Singapore's guide to responsible AI published by IMDA and the PDPC. It promotes fair, transparent, and explainable use of AI in decision-making.
How to uphold fairness when using AI
- Use job-related data only. Feed the system with criteria pulled from your job analysis and description. Exclude personal characteristics and unrelated signals. Keep assessments tied to skills and the ability to perform the job.
- Keep people in control. Let AI inform, not decide. Recruiters and hiring managers must review and approve shortlists, interview outcomes, and offers. Document the rationale at each stage.
- Be open with candidates. Explain where AI is used (e.g., screening, scheduling) and confirm that final decisions are made by people. Provide a contact point for questions or appeals.
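The first principle above, using job-related data only, can be sketched in a few lines of Python. This is a minimal illustration, not a production implementation: the field names, the allow-list, and the sample record are all hypothetical, and a real pipeline would draw the allow-list from your own job analysis.

```python
# Hedged sketch: strip non-job-related fields before any screening or scoring.
# JOB_RELATED_FIELDS and the record keys below are hypothetical examples;
# adapt them to your own ATS export and job analysis.

JOB_RELATED_FIELDS = {"skills", "years_experience", "certifications", "work_samples"}

def minimise(candidate: dict) -> dict:
    """Keep only fields tied to the job analysis; drop everything else
    (age, nationality, photos, marital status, etc.) before screening."""
    return {k: v for k, v in candidate.items() if k in JOB_RELATED_FIELDS}

record = {
    "name": "A. Tan",
    "age": 42,                      # non-job data: removed
    "nationality": "SG",            # non-job data: removed
    "skills": ["python", "sql"],
    "years_experience": 6,
}
print(minimise(record))  # only skills and years_experience survive
```

The same filter can double as an anonymisation step: because the name field is not on the allow-list, reviewers see skills and experience rather than identity signals.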
Practical checklist for HR teams
- Job analysis → screening matrix: Translate essential functions into clear must-have and nice-to-have criteria with weights and thresholds.
- Data minimisation: Remove age, gender, nationality, marital status, religion, photos, and other non-job data from inputs. Anonymise resumes where feasible.
- Vendor due diligence: Ask for documentation on data sources, features used, bias testing, human override, and model update cadence. Confirm storage location and PDPA compliance.
- Bias monitoring: Track pass-through rates by relevant groups at each stage. Investigate gaps and adjust criteria, tools, or processes when disparities appear.
- Human-in-the-loop controls: No auto-rejections near thresholds. Require manual review for borderline candidates and any exceptions.
- Versioning and logs: Keep a changelog of model settings, prompts, training data updates, and score thresholds. Retain audit trails of decisions and communications.
- Candidate communications: Use clear language, provide timelines, and share who to contact for queries or appeals.
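Three of the checklist items above, the weighted screening matrix, the no-auto-rejection rule near thresholds, and pass-through-rate monitoring, can be sketched together. All criteria names, weights, thresholds, and group labels below are illustrative assumptions, not recommended values; set your own from the job analysis and review them with HR and legal.

```python
# Hedged sketch: weighted screening matrix with a manual-review band
# and stage pass-through monitoring. Every number here is an example.

WEIGHTS = {"must_have_skills": 0.5, "relevant_experience": 0.3, "domain_knowledge": 0.2}
PASS_THRESHOLD = 0.70
REVIEW_BAND = 0.05   # no auto-rejection within this distance of the threshold

def score(ratings: dict) -> float:
    """Weighted score from per-criterion ratings in [0, 1]."""
    return sum(WEIGHTS[c] * ratings[c] for c in WEIGHTS)

def decide(ratings: dict) -> str:
    """Advance, flag for human review, or not advance; borderline
    candidates always go to a recruiter, never an auto-rejection."""
    s = score(ratings)
    if s >= PASS_THRESHOLD:
        return "advance"
    if s >= PASS_THRESHOLD - REVIEW_BAND:
        return "manual_review"
    return "not_advanced"

def pass_through_rates(outcomes: list) -> dict:
    """Share of candidates advancing per group, for disparity checks.
    outcomes: (group_label, decision) pairs; investigate large gaps."""
    totals, advanced = {}, {}
    for group, decision in outcomes:
        totals[group] = totals.get(group, 0) + 1
        if decision == "advance":
            advanced[group] = advanced.get(group, 0) + 1
    return {g: advanced.get(g, 0) / totals[g] for g in totals}

# Example: 0.5*0.8 + 0.3*0.7 + 0.2*0.6 = 0.73, above the threshold
print(decide({"must_have_skills": 0.8, "relevant_experience": 0.7, "domain_knowledge": 0.6}))
```

Keeping the weights and thresholds in plain configuration like this also supports the versioning item: a dated changelog of these constants is a large part of an audit trail.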
Competency-based questions to keep interviews fair
- "Tell me about a time you solved a problem with incomplete information. What did you do and what changed?"
- "Describe a project where you had to influence stakeholders with different priorities. How did you approach it?"
- "Walk me through a mistake you made, how you handled it, and what you changed after."
- "Give an example of meeting a tight deadline. What trade-offs did you make and why?"
Documentation to keep (audit-ready)
- Job analysis, job description, and the screening matrix with weights
- Interview guides, scorecards, and panel notes
- AI tool configuration, version history, prompts, and change logs
- Candidate-stage reports and selection rationales
- Candidate communications (acknowledgements, invites, outcomes)
Bottom line
Keep fairness at the forefront and people at the heart of every hiring decision. Use AI to serve the process, not to replace judgment.
For guidance on fair and progressive practices, visit TAFEP. For governance standards that support fair, transparent, and explainable AI use, see the IMDA-PDPC framework.
If your HR team is building AI skills, explore role-specific learning paths here: Complete AI Training - Courses by Job.