How to Spot—and Stop—Unconscious Bias in Hiring
In early 2022, Eniitan Fagbola joined Toronto tech start-up ThinkData Works as a talent acquisition lead. The data company’s client base was growing, and it needed to scale up and hire more staff. Fagbola’s job was to help recruit the best candidates.
Fagbola asked leadership, hiring managers and HR about their interview processes. “There were none,” she laughs. Some interviewers took notes, others didn’t. There were no best-practice guidelines to follow. “It was chaotic, and it didn’t need to be,” she says. She wanted to make changes to ensure candidates had similar interview experiences across the board.
Research shows that a standardized interview process helps to reduce unconscious bias—prejudices that form without the interviewer even realizing it. Though often unintentional, these biases arise when the brain kicks into rapid-fire decision making, which can result in unfairly favouring or discriminating against candidates. By asking all candidates the same questions—which should be skills-based and clearly linked to the duties of the job—hiring managers can make decisions based on qualifications rather than personal biases, like favouring someone who attended the same university. Effective questions include “Describe a time when you worked independently to finish a task on deadline” and “Tell me about a time you failed. How did you deal with the situation, and what did you learn?” Standardizing the process is one way to give candidates the best shot at landing a job.
Leadership gave Fagbola free rein to update the hiring process to make it more equitable and streamlined. She wanted to make sure the company makeup was reflective of the community it operated in. She went to work, drawing on her past experience overhauling processes in campus recruitment for Deloitte UK.
A best-practice guide was circulated to everyone at the company, outlining new methods to reduce unconscious bias, such as the rule that two interviewers should evaluate a candidate together rather than solo. Alone, an interviewer can form their own opinions and biases, like affinity bias—the tendency to feel a natural connection with people who are similar to us. In pairs, an interviewer is less likely to preemptively dismiss a candidate in their 50s by assuming they’re overqualified for the role (perception bias), or to take a shine to a candidate from the same hometown. Two brains are better at minimizing unconscious bias, because interviewers can later compare notes and focus on the candidate’s skills.
Fagbola also made detailed note-taking mandatory. Interviewers were instructed to write down experiences or attributes they shared with the interviewee in an effort to spot their biases. Fagbola, for example, would note when she interviewed a Black woman or an immigrant to Canada, like herself. This let her revisit her notes later to consider whether bias had crept in: Did I really like this candidate, or was there a bias because they’re like me? If an interviewer notices a trend in their notes, like consistently favouring candidates who play hockey, Fagbola says extra training and brushing up on interview skills can help reduce bias.
Fagbola’s best-practice guide has a section called “You Can’t Ask That.” Age, family status, race, religion and sexual orientation are all off-limits. Asking “Do you have children?” seems innocent enough, but making a hiring decision based on family status is a no-go; the Canadian Human Rights Act says it’s illegal for a federal-sector employer to ask candidates about their age, gender identity, sexual orientation, family status, race, religion and mental or physical disabilities.
Standard interview questions were also overhauled. Fagbola noticed interviewers asked broad questions like, “Tell me about yourself.” These are wholly unhelpful, especially for neurodivergent folks who may be more comfortable answering a straightforward question. A more direct, specific prompt, like “What would you do if a task was not up to standard, but the deadline to complete it had passed?” will yield more relevant answers. (And a candidate is less likely to nervously ramble on about their love for basketball or baking.)
ThinkData Works’ job postings were updated with more inclusive language, too. They directly encourage those who have been out of the workforce for extended periods to apply: “We know there may be gaps in your resume or ‘non-traditional employment,’ however, we know life happens and invite you to apply.” The goal was to make the application process as welcoming as possible to those who left the workforce to raise children, new immigrants and those who were simply burnt out during the pandemic. “People should be allowed to take breaks or time off,” Fagbola says.
They also stopped asking for traditional cover letters. Having read thousands of these letters over her career, Fagbola doesn’t believe they’re the most valuable part of a job application. Instead, candidates are asked specific questions in a written application related to the role they’re applying for, like “Why do you want this job?” and “Is there anything not on your resume that you want to tell me about?” A data engineer may be asked something like, “Tell me about a past project and some of the technology you’ve used to execute it.” These answers are passed along to Fagbola through an applicant-tracking system.
Recruitment bias costs money. More than three-quarters of senior managers admit to having hired the wrong candidate for a role, according to recruitment agency Robert Half. It takes an average of 11 weeks to realize the person was a poor match, and five more to restaff the position. That idle time adds up; some estimate the average cost of a poor hiring decision at no less than 30 per cent of the person’s expected first-year earnings.
Since implementing these changes, ThinkData Works has seen a seven per cent increase in new hires from underrepresented groups. Fagbola’s hires have been split 50:50 between men and women, bucking the stereotypical tech trend of white, cisgender men. According to research published by Deloitte, women make up only about 33 per cent of the workforce at global tech firms.
For recruiters looking to usher in more equitable interview processes, Fagbola suggests moving away from the idea of “culture fit”—the likelihood a candidate will adapt to the values and collective behaviours at an organization. A focus on fit may lead hiring managers to pick someone who is similar to them, or looks like them, rather than choosing a candidate based on their competence. Fagbola says candidates should be assessed on “culture add,” which is how they will actually enhance the work culture through their skill set or an intersectional perspective that rounds out the team.
“We tell our interview teams: Let the candidate know there is space for them here,” she says. “Because some of our candidates turn into our employees—and they’re our best asset.”