Limiting Personal Information of Job Candidates Can Lead to Fairer Hiring Practices

For more than 15 years, management and organizations professor Rick Larrick has insisted his students at Duke University’s Fuqua School of Business leave their names off their exams before turning them in.

Instead, they should identify their work with only a student number so Larrick can’t see whose essay he is reading. As a social psychologist who researches decision-making and the influence of bias, Larrick said he adopted this policy to judge students’ work as accurately and fairly as possible.

“By the time I’m reading an exam, I often have opinions about which students have done a great job in class, and which ones have not,” said Larrick, who also serves as Fuqua’s Associate Dean for Diversity, Equity and Inclusion. “If you have a favorable impression of them, that could leak into the process of reading their open-ended essay and interpreting what they are trying to say – especially any parts that are unclear or ambiguous.”

‘Blinding,’ or restricting information when making an evaluation or decision, has been used for decades to limit the impact of racism, stereotypes and other biases. More than 50 years ago, symphony orchestras were holding blind auditions to reduce the influence of gender stereotypes in hiring.

Yet, according to ongoing research Larrick described in MIT Sloan Management Review, fewer than one-fifth of hiring managers work at organizations with blinding policies. And although the managers in a study Larrick led averaged 14 years of experience, just one-fifth said they had ever received training in the technique.

Larrick explained some of the findings and what people can do to limit their own biases.

What is ‘blinding’ and why aren’t more organizations using it in hiring?
Blinding is restricting information to reduce its influence on decision-making. For example, a job candidate’s name may reveal their gender, or even suggest their age, race, ethnicity or nationality. To reduce bias and increase fairness, names are removed from applications or résumés until after top candidates have been chosen for interviews.
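
To make the mechanics concrete, here is a minimal sketch of what résumé blinding can look like in practice, assuming a simple record per applicant; the field names and data are hypothetical, not from the research.

```python
# Minimal sketch of resume blinding: identifying fields are withheld from
# reviewers until the shortlist stage. Field names here are hypothetical.

IDENTIFYING_FIELDS = {"name", "photo_url", "age", "gender", "nationality"}

def blind(application: dict) -> dict:
    """Return a copy of the application without identifying fields."""
    return {k: v for k, v in application.items() if k not in IDENTIFYING_FIELDS}

applications = [
    {"id": 101, "name": "A. Candidate", "gender": "F",
     "skills": ["Python", "SQL"], "years_experience": 6},
    {"id": 102, "name": "B. Candidate", "gender": "M",
     "skills": ["Java"], "years_experience": 9},
]

# Reviewers screen only the blinded copies ...
shortlist_ids = [app["id"] for app in map(blind, applications)
                 if "Python" in app["skills"]]

# ... and full records are consulted only for shortlisted candidates.
shortlisted = [app for app in applications if app["id"] in shortlist_ids]
print([app["name"] for app in shortlisted])
```

The design point is that reviewers only ever handle the blinded copies; the full records come back into play after the shortlist is set.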

Our research suggests managers typically want all information available about a candidate – name, photo, race, gender, etc. In some sense, this is an automatic habit, since we are used to learning these things about people we meet in person. It’s also driven by curiosity. But as research has shown, these details do contribute to bias. Our main innovation has been developing techniques to counter this automatic habit and curiosity.

How can people stop bias from creeping into these decisions?
Curiosity is a strong motivator for most people. However, our series of studies showed we could nudge people into the right mindset. Before participants looked at any candidate information, we reminded them that their goal was to make an unbiased decision. We then asked them to consider what information was necessary to achieve this goal, and what information they should avoid. People were much less likely to look at extraneous information when we prompted them in this way.

This was the most surprising discovery – people readily acknowledged, upon reflection, that some biographical information could create bias and that they shouldn’t see it. Many other biases follow a different pattern – one of overconfidence, in which people think, ‘I’m not biased.’ In this case, people do recognize they might be biased and welcome the help that blinding offers.

Another way to satisfy an evaluator’s curiosity is to promise to share more information about a candidate, but only after they have made an evaluation. This sequencing is essential to making blinding work.

If we simply delay when people receive the additional information (name, gender, etc.), we keep that information from influencing what is read or heard. The ‘music speaks for itself,’ without biases tainting the interpretation. And the evaluator is happy, because they can satisfy their curiosity and be more accurate: delaying the reveal reduces the potential role of bias in their judgment.
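
As a rough illustration of that sequencing, the sketch below records the evaluation before any withheld details can be revealed; the class and field names are hypothetical, not something from the studies.

```python
# Sketch of 'evaluate first, reveal later': the score is recorded before
# any identifying details can be viewed. Names here are hypothetical.

from dataclasses import dataclass, field
from typing import Optional

@dataclass
class BlindEvaluation:
    work_sample: str                                           # visible up front
    _details: dict = field(default_factory=dict, repr=False)  # withheld info
    score: Optional[float] = None

    def record_score(self, score: float) -> None:
        self.score = score

    def reveal_details(self) -> dict:
        # Enforce the sequencing: no reveal until a score is on record.
        if self.score is None:
            raise RuntimeError("Record a score before revealing details.")
        return self._details

audition = BlindEvaluation(
    work_sample="recording_047.wav",
    _details={"name": "C. Musician", "conservatory": "..."},
)
audition.record_score(8.5)
print(audition.reveal_details())  # available only after scoring
```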

What will persuade more businesses to use blinding or similar techniques?
Overall, research to this point has considered blinding as a top-down policy that would be implemented by wise managers. Our studies turn that around to see whether individuals recognize it as a wise strategy – whether it’s something they would adopt as a personal choice, similar to how I developed my own blinding practices for grading.

Our studies with individuals may also help firms implement company-wide blinding policies. If firms walk employees through exercises like the experiments we conducted, employees can first see for themselves that certain information can create bias, which could increase their interest in using these techniques.

Tech firms have done one clever thing: instead of relying on résumés and interviews to hire programmers, they administer tailored tests whose results can be viewed without any information about the test-taker. That way, they can evaluate a person’s skills without forming positive or negative biases based on other details, such as where the person went to college.
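
One simple way to picture that kind of blind skills testing: submissions are graded under opaque tokens, and the token-to-candidate mapping is consulted only after grading is done. This is a sketch under those assumptions; the grading function is a hypothetical placeholder.

```python
# Sketch of blind skills testing: submissions are graded under opaque
# tokens, and the token-to-candidate mapping is consulted only after
# grading. grade() is a hypothetical placeholder, not a real grader.

import uuid

candidates = {"cand_1": "A. Candidate", "cand_2": "B. Candidate"}
submissions = {"cand_1": "def solve(): ...", "cand_2": "def solve(n): ..."}

# Each submission gets a random token before it reaches any grader.
token_to_candidate = {uuid.uuid4().hex: cid for cid in submissions}

def grade(source_code: str) -> float:
    """Placeholder for running a hidden test suite and scoring the result."""
    return float(len(source_code))

scores = {tok: grade(submissions[cid])
          for tok, cid in token_to_candidate.items()}

# Only after all grading is done are tokens mapped back to people.
results = {candidates[token_to_candidate[tok]]: s for tok, s in scores.items()}
print(results)
```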

Can masking candidates’ biographical data help increase diversity?
We don’t want to oversell blinding as a technique that solves all problems. When trying to build a more diverse workforce, it’s important to take candidate backgrounds into account in the selection process, and to work hard to increase representation in the candidate pool for jobs that have historical disparities.

Our key message is that there can be times to ‘let the music speak for itself’ (or exam answers, or programming skills). We think people welcome the benefits of blinding when you make them reflect on these insights.
