Jackye Clayton and John Baldino dig into how hiring bias manifests not just in human decision-making but deep inside the technology, processes, and systems organizations rely on every day. They reveal how applicant tracking systems create barriers for blue-collar workers, why AI tools learn and replicate hiring manager bias over time, and how poorly written job descriptions quietly filter out top talent. Using real examples from construction, fine dining, and healthcare, they show why skills alone are not enough and why contextualizing skills within your specific environment is critical. This episode arms HR and talent acquisition (TA) professionals with practical strategies for auditing their hiring process for hidden bias.
Key Takeaways:
- Applicant tracking systems that are not mobile-friendly create implicit bias against blue-collar and frontline workers
- AI screening tools learn hiring manager preferences over time and start dismissing qualified candidates who do not fit a narrowing pattern
- When hiring is down overall, compare people-group data proportionally before drawing conclusions about bias
- Resume parsers in major ATS platforms are still broken, making it unnecessarily difficult for candidates to apply
- Skills-based hiring needs context; a cook from a casual restaurant is not interchangeable with a fine-dining garde manger
- AI-generated job descriptions that lack specificity create a different form of bias through vagueness
- Organizations need regular audits of their AI and tech stack to catch bias that builds over six to twelve months
- Blind hiring can backfire by making reviewers unconsciously gravitate toward candidates with familiar backgrounds
- The drop-off rate between starting and completing job applications is staggeringly high across most ATS platforms
- Sometimes the best hiring happens when you bypass the system and let candidates talk directly to the hiring team
00:00 - Introduction and opening banter
16:15 - Setting up the bias in hiring conversation
18:26 - The tool that claimed to eliminate DEI and why it was inherently biased
19:34 - Using math correctly: comparing hiring declines proportionally across groups
21:49 - Look at who has left your organization to find where bias may exist
25:03 - Construction industry losing six workers for every one hired
30:04 - How ATS technology blocks blue-collar and frontline candidates
33:20 - Broken resume parsers in Workday, iCIMS, and major platforms
36:05 - LinkedIn application click-through vs. actual completion rates
40:01 - Fine dining example: why skills need environmental context
45:56 - AI learns hiring manager bias and self-affirms narrow patterns
49:10 - Why regular tech stack audits are essential to catch creeping bias
Keywords: hiring bias ATS, AI bias in recruiting, applicant tracking system barriers, skills-based hiring context, resume parser problems, blue-collar recruiting bias, tech stack audit hiring, blind hiring drawbacks, job description bias, frontline worker hiring
Powered by the WRKdefined Podcast Network.