
Prasadh M S

Bots & Algos: Half Truth on Hiring Bias

Meet ‘Tengai’, the job interview robot who won’t judge you – ran the headline of a recent BBC business story. I read it again… it just didn’t sound right! The use of ‘who’ instead of ‘that’ seemed like a typo. With ‘who’ typically used to denote people and not objects… did BBC just elevate Bots to Human status? Yes, I may be nitpicking on a typo, but I guess I am resisting the imagery of a bot that can judge me but has chosen not to! 😊 Tengai promises interviews without bias, and any technology that claims a bias-free outcome leaves me wondering if it is overrated or just a classic half-truth!

The Pet Peeve

‘Hiring Bias’ has been HR Tech’s pet peeve for a while now. Interviewers have been labelled the problem child with a huge, incurable bias. Bot builders swear by the interviewer’s inability to carry out ‘unbiased’ hiring. With all fingers pointing at the problem child, the popular fix to eliminate bias seems to be to eliminate the interviewer! If not complete elimination, the attempt is to at least distance the interviewer far enough from the interviewee to reduce the bias! 😊

An oft-quoted Gartner study on this subject identified 10 Cognitive Biases that affect Hiring. To HR Tech’s excitement, Gartner’s recommendation to neutralise Hiring Bias was more “HCM Technology”. The report summarised that organisations should establish data-driven decision making to mitigate the effect of bias. In short…it said throw more Technology and Kill the Bias! After all, hasn’t Technology often turned up as the magical stone to kill the complicated bird?

Ok…Let’s Engineer it!

Can’t deny the strange fact that fixing Hiring Bias is now approached as a technology challenge rather than an HR challenge. And don’t techies love challenges that are technically not theirs? 😊 They’ve jumped in neck-deep en masse, and two large schools of Engineering have emerged with contrasting philosophies to fix the Bias:

– one that proposes to replace the human element with data-neutral Robotic Process Automation. This school produces tools and bots that (claim to) do everything from writing bias-free JDs to triggering bias-free Welcome Mails!
– the second stream that proposes to anonymize hiring data to remove bias-inducing identifiers. This school churns out solutions that ‘neutralize’ hiring data at each stage of the process. From sourcing to screening to interviewing and selection, their tools just ride on anonymity and Blind Hiring (a minimal sketch of this idea follows the list).
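To make the second school’s approach a little more concrete, here is a minimal sketch of what a ‘blind hiring’ redaction step could look like. It is purely illustrative: the field names and the list of identifiers are assumptions made for this example, not any vendor’s actual schema or method.

```python
# Illustrative sketch of a "blind hiring" redaction step.
# Field names and the identifier list are hypothetical, not any vendor's schema.

BIAS_INDUCING_FIELDS = {"name", "gender", "age", "photo_url", "address", "nationality"}

def anonymize_candidate(record: dict) -> dict:
    """Return a copy of the candidate record with identity cues stripped,
    keeping only job-relevant fields for the reviewer."""
    return {k: v for k, v in record.items() if k not in BIAS_INDUCING_FIELDS}

candidate = {
    "name": "Jane Doe",
    "gender": "F",
    "age": 34,
    "skills": ["Python", "SQL"],
    "experience_years": 8,
    "education": "MSc Computer Science",
}

print(anonymize_candidate(candidate))
# {'skills': ['Python', 'SQL'], 'experience_years': 8, 'education': 'MSc Computer Science'}
```

Of course, as the HBR finding quoted later suggests, identity cues also leak through free-text content, which is exactly where this kind of simple field-level redaction falls short.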

Both schools have created a massive line-up, with the likes of Debra, Mya, Olivia, Ari and a bunch of other bots set to make recruitment bias-free in their own ways. Well, don’t ask me about the gender bias in naming Bots!

Oops! Stalemate?

While both Schools remain convinced of their capabilities, there are blatant exposés of their Achilles’ heels. High-stakes use cases and scenarios keep emerging that they just can’t seem to tame.

• Amazon shut down its secret AI Recruiting Tool after it showed bias against women and they couldn’t get it to unlearn the Gender Bias.
• 65% of recruiters in a study declared that it is a struggle to recruit meaningfully with anonymised resumes.
• An HBR study revealed that minorities who “whitened” their résumés by removing racial cues got more interview callbacks than those who did not.

With failing Bots, resisting Recruiters and candidate-manipulated data, engineering a pure-tech remedy for hiring bias seems to be heading into a stalemate.

Divide and Rule!

Every touchpoint between an interviewer and a candidate or the candidate’s data has been marked as a potential spot for unconscious bias. The seemingly sure-shot (a.k.a. short-sighted) fix is to distance the stakeholders. Introducing buffer layers between stakeholders moderates the exchanges and dilutes the bias. A huge arsenal of tools has hence been built on this Divide and Rule principle. There are options for every step in the recruitment workflow to be tech-assisted, tech-validated or directly executed by a bot.
Tengai, for example, removes any direct connection between the interviewer and the candidate. It interviews the candidates, records their responses and converts them to text. The anonymized text transcript is all the recruiter gets to review when making the selection choices. That’s quintessential Divide and Rule in action! (A rough sketch of the idea follows.)
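For illustration only, here is a rough sketch of that buffer-layer pipeline: the bot interviews, transcription and redaction happen in between, and the recruiter only ever sees neutral text. Every function name and step below is a hypothetical stand-in for the general idea, not Tengai’s actual implementation.

```python
# Illustrative "Divide and Rule" pipeline: the recruiter never meets the candidate,
# only an anonymized transcript. All names here are hypothetical stand-ins.

import re

def conduct_interview(candidate_id: str, questions: list[str]) -> list[str]:
    # Stand-in for the bot asking questions and capturing spoken answers.
    return [f"answer from {candidate_id} to: {q}" for q in questions]

def transcribe(answers: list[str]) -> str:
    # Stand-in for speech-to-text conversion of the recorded responses.
    return "\n".join(answers)

def anonymize(transcript: str, candidate_id: str) -> str:
    # Replace the identifier so the reviewer sees only neutral text.
    return re.sub(re.escape(candidate_id), "CANDIDATE", transcript)

questions = ["Tell me about a project you led.", "How do you handle conflict?"]
transcript = transcribe(conduct_interview("jane.doe@example.com", questions))
print(anonymize(transcript, "jane.doe@example.com"))  # what the recruiter reviews
```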

Time for a Pause

All said, it would be archaic to cry foul over everything in this space. Technology advancements are non-negotiable and no doubt keep us moving ahead on efficiency. However, not maintaining a balance between human and artificial capabilities is a concern. Specific to Recruitment, solutions seeking to automate entirely may have to slow down. Touting them as replacements for high-cognition human tasks will have to be carefully reviewed. Putting interviewers on standby while Technology does the work is bound to backfire in the long term.

Technology as a sole remedy for Hiring Bias is just another case of solving the symptoms and not the cause. If Hiring Bias is real, it’s time that humanities, social sciences, behavioral and management sciences took the lead on this.

After all, can we move the human out of human interactions?
