The Science Behind Unconscious Bias - And How It Affects Hiring

Published by Joe Caccavale | July 20, 2020 | 11 min read


If anybody ever tells you they don’t have unconscious biases, they’re lying!

Unconscious bias is a part of human nature. The more life experience we have, the more chance there is of inherent biases shaping our perception of the world around us.

For a fairer, more equal world, we have to do something about unconscious bias and hiring bias. It's the only way that we can create a culture of inclusion.

But the first step is accepting that it's inevitable: we all make quick judgements based on personal experiences, and these judgements are often informed by hidden biases.

Just telling people about their unconscious bias is as much use as a chocolate teapot.

So, here’s the actual science behind unconscious bias, and how Applied's hiring software can remove varying types of bias from the recruitment process and help leaders make the right hiring decisions.

Unconscious bias definition

Unconscious bias is, essentially, any prejudices we may have, of which we are unaware. 

We naturally categorise others based on physical qualities and background - from ethnicity to education.

Not all of these biases are necessarily discriminatory. Hiring, for example, is largely affected by implicit biases we have towards certain candidates, rather than against other candidates.

As subjective individuals, we naturally gravitate towards what is familiar to us based on our own unique life experiences. This means that certain attributes (e.g. ethnicity, gender, class) have the power to entirely cloud our judgement.

Types of unconscious bias

These are some of the most common types of unconscious bias:

Perception bias

Perception bias is when we believe something is typical of a particular group of people based on cultural stereotypes or assumptions.

Affinity bias

The term 'affinity bias' describes our tendency to feel as though we have a natural connection with people who are similar to us.

Halo effect

When we project positive qualities onto people without actually knowing them, we have become susceptible to the 'halo effect'. Conversely, the 'horn effect' describes how an initial negative impression of someone leads us to keep associating them with negative characteristics.

Confirmation bias

We generally look to confirm our own opinions and pre-existing ideas about a particular group of people. When we enter a situation hoping to have our initial expectations met, this is often an example of confirmation bias.

Attribution bias

Have you ever noticed that we tend to be more forgiving of ourselves than others? This is usually due to attribution bias. This term describes how we attribute others' behaviour to internal characteristics and our own behaviour to our environment.


Having these biases doesn’t make you a bad person.

It makes you human.

As the name suggests, unconscious bias is unconscious… which is why it’s so tricky to eliminate - we don’t know we’re doing it! That's why our hiring software has been specially built to eliminate unconscious bias by design.

The way our brains naturally process decisions means that we, as humans, are prone to unconscious bias...

Our brain uses two systems to make decisions

In our candidate screening webinar, Applied co-founder Kate talked through a quick thought experiment that demonstrates how easily our brain can misfire and mistakenly recall information.


This misfiring means that our intention doesn’t always result in equal action.

Even though we may set out with the best of intentions, the way our brain is wired and the context of the decision making can affect how our intention translates into action.

How we feel at the time, past experiences, the way information is presented to us and even who imparts this information to us can all influence decision making.


Our brain makes thousands of decisions every day.

And as the experiment above shows, it can also misfire thousands of times.

If we look at the sorts of decisions we make day-to-day, there are big, important decisions, such as planning a project at work.

And there are minor, everyday decisions, like deciding which route to take home.

Decision fatigue is a real thing. Your ability to make good choices deteriorates the more decisions you make.

So to lighten the load, our brain’s decision-making is broken down into two systems - one fast-thinking and one slow-thinking - a theory popularised by Daniel Kahneman’s Thinking, Fast and Slow.


System 1: used for everyday, intuitive decisions. These are the sort of decisions that if you had to think long and hard about every one of them, you’d likely have a meltdown - like how much milk to put into your tea, what top to wear in the morning, or when to cross the road. For the most part, this sort of decision making is a bit like being on autopilot.

System 2: used for bigger, more important decisions like planning a trip or working on a big presentation. System 2 thinking tends to be slower, considered, and more mentally taxing.

Having these two systems is what allows us to stay productive and sane. 

We make so many tiny decisions throughout the day that we need system 1 to take the brunt of them so that we reserve our capacity for conscious and informed decision making for when it’s actually needed.

Our brains use shortcuts and patterns to draw conclusions, by using the information subconsciously stored in our mental lockers: this includes things like our past experiences and upbringing, what we’ve seen in the media and our individual opinions. 

All of this allows us to make swift assessments of situations.

However, we often use system 1 when we should be using system 2 (as is often the case in hiring, which we’ll get to shortly).

This is why we have unconscious bias.

Since system 1 thinking relies on subconscious shortcuts and a ‘gut instinct’ of sorts, unconscious bias arises when we jump to mental shortcut-based conclusions instead of using the slower, more conscious part of our brain.

Unconscious bias in hiring

Unconscious bias plays a major role in traditional hiring - we tend to resist the unfamiliar.

When faced with a pile of CVs, our system 1 brain seems to kick in, and we make snap-decisions based on rapid-fire associations.

Although unintentional, this can result in the unfair favouring of or discrimination against candidates… and it runs deeper than the old trope of male bosses hiring in their own image.

Recruitment bias can be costly. For instance, according to PeopleKeep, hiring the wrong person for a sales role can cost a company up to 75% of that person's annual salary. A data-driven hiring solution such as the Applied platform can save significant costs over time.

Unconscious bias in recruitment examples

  • We might see that someone went to a good university and automatically assume this makes them intelligent.

  • We might rule out qualified candidates who we perceive as different from their potential colleagues, on the basis that they might not be the right 'culture fit'.

  • We might see that someone is older than the average candidate and assume they’re less ‘hungry’ or capable of using the latest tech.

  • We may show unconscious bias towards candidates who remind us of people we’ve had positive experiences with - and once we’ve made up our minds about someone, we look for reasons to keep liking them.

  • We might see someone’s address and - due to a bad experience with that area or stereotypes around its inhabitants - rule that candidate out.

Just by glancing over someone’s CV, any number of biases could be triggered.

What you might think of as intuition or trusty gut instinct is actually just your brain making rapid-fire shortcuts to speed up decision making…

Decision making that should be anything but quick and intuitive.


As you can see above, almost all of the information provided on a CV is grounds for unconscious bias.

And the same can be said for LinkedIn profiles. If you look at my profile, the bias triggers play out much like they do for CVs. 

LinkedIn profile unconscious bias


The result of biased hiring decisions: minority groups are disproportionately overlooked.

In a 2004 US study, around 5000 resumes with either African-American- or white-sounding names were sent out to a variety of companies, measuring the number of interview invites for each.

Half were assigned "remarkably common" African-American names, and the other half were assigned white-sounding names, such as Emily Walsh or Greg Baker... 

Job applicants with white-sounding names needed to send about 10 resumes to get one callback; those with African-American-sounding names needed to send around 15 - a callback rate of roughly 10% versus 6.7%, meaning white-sounding names received about 50% more callbacks.

A candidate with an African-American-sounding name would need an additional 8 years of experience to close this 50% callback gap.
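To make the arithmetic behind that '50% callback gap' explicit, here’s a minimal sketch using the rounded figures quoted above (the study’s exact callback rates differ slightly, so treat this as an illustration only):

```python
# Illustrative arithmetic only, based on the rounded figures quoted above
# (about 1 callback per 10 resumes vs. 1 per 15).
white_callback_rate = 1 / 10   # ~10.0% callback rate
black_callback_rate = 1 / 15   # ~6.7% callback rate

relative_gap = white_callback_rate / black_callback_rate - 1
print(f"White-sounding names received ~{relative_gap:.0%} more callbacks")
# -> White-sounding names received ~50% more callbacks
```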

Another study, this time conducted in Germany, ran a similar CV-based test...

Job applications for three fictitious female characters with identical qualifications were sent out in response to job advertisements: one applicant had a German name, one a Turkish name, and one had a Turkish name and was wearing a headscarf in the photograph included in the application.

Here are their findings:


Research here in the UK ended up with similar findings too.

Inside Out London sent identical CVs from ”Adam" and "Mohamed" to 100 open roles... 


Adam was offered 12 interviews, while Mohamed was offered 4.

So, for identical CVs, the candidate with a Muslim-sounding name received only a third as many interview offers.

Unconscious bias in recruitment doesn’t just affect ethnic minorities… gender bias is also prevalent in the recruitment process

In a randomised, double-blind study, science faculty members from selected universities were asked to rate students’ applications for a laboratory manager position.

All applications were the same except for the names, which were randomly assigned either a male or female name. 

Faculty participants rated the male applicant as significantly more competent and hireable than the (identical) female applicant.

The mean starting salary offered to the male students was also significantly higher - around 1.15x that offered to the female students.

Did you know that the words you use in job descriptions can show a preference towards male candidates? Our job description analysis tool shows how gendered language (e.g. words such as 'ambitious' and 'competitive') can deter female job seekers from submitting a job application. It also offers gender-neutral alternatives and assesses readability, so companies can create job descriptions that attract a wide pool of candidates.
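To illustrate the general idea (this is a simplified sketch, not Applied’s actual implementation), a gendered-language check can be as basic as matching a job description against lists of gender-coded words from the research literature and suggesting more neutral alternatives. The word lists and swaps below are illustrative examples only:

```python
import re

# Illustrative, heavily truncated word lists - not Applied's actual dictionaries.
# Research on gender-coded language (e.g. Gaucher, Friesen & Kay, 2011) is the usual source.
MASCULINE_CODED = {"ambitious", "competitive", "dominant", "fearless", "ninja"}
NEUTRAL_SWAPS = {"ambitious": "motivated", "competitive": "driven to succeed"}

def flag_gendered_language(job_description: str) -> list[tuple[str, str]]:
    """Return (flagged word, suggested alternative) pairs found in the text."""
    words = set(re.findall(r"[a-z']+", job_description.lower()))
    flagged = words & MASCULINE_CODED
    return [(w, NEUTRAL_SWAPS.get(w, "consider a more neutral phrasing"))
            for w in sorted(flagged)]

print(flag_gendered_language("We want an ambitious, competitive self-starter."))
# -> [('ambitious', 'motivated'), ('competitive', 'driven to succeed')]
```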

Many common biases are the result of stereotypes

Shocking results like those shown above are likely down to the fact that the system-1-based mental shortcuts we make tend to stem from stereotypes.

Stereotypes dictate the type of people we would expect to see in a certain role.

This stereotype bias is bred into us as children and continually perpetuated throughout our adult lives.

A quick Google search for a given job proves this to be true.


Google image results:

  • “Nurse” - 81% women
  • “Surgeon” - 68% men

And another example - the screenshot below is from the website of a large, multinational company.

Although there’s a possibility (albeit an extremely slim one) that this was mere coincidence, men have been stereotyped as having technical careers, and women non-technical ones. No wonder the tech industry is often criticised for its lack of gender diversity. In fact, studies of workplace diversity at Apple, Google and Facebook found that approximately 23% of technical positions were held by women.

When hiring for any role, we may be more likely to favour candidates who ‘fit the bill’ for the role.

Given all of the behavioural science context above, ‘fitting the bill’ or 'culture fit' seems to equate to conforming to the stereotype of what someone in the role would look, sound and be like…

It also doesn’t help that we tend to better recall and value information that confirms our preexisting beliefs (confirmation bias).

Below are the results of a US study in which perceptions of ‘out-groups’ were measured along the dimensions of warmth and competence.

As you'll see, we have many of these preexisting beliefs about others based on their belonging to certain groups.


When it comes to hiring decisions, they can be as much about matching a preconceived idea of what the person should look like as they are about actually finding the most skilled candidate.

If we generally think of surgeons as white males between 30 and 50, is it any surprise that these are the people who are hired into these positions?

It’s not just about someone’s characteristics - context matters too

Even if we were able to remove all unconscious biases around personal backgrounds and identity, the context in which we receive information can still majorly influence our decision making. 

In one Israeli study, judges’ parole-granting decisions were measured over the course of a day.

Here are the results:


Generally, it seems that the judges made harsher decisions over time.

This is because the more decisions we make (and the more ‘decision fatigue’ we endure), the more risk-averse we become.

In the case of these judges, they were more likely to make favourable (and in the context of parole-granting, riskier) decisions earlier in the day, when they were least fatigued.

And do you see the spikes in favourable decisions?

These were following the judges’ breaks.

The key takeaway here is that we are heavily influenced by ordering biases. The order in which information is given to us affects how we view that information.

The only way to make truly objective hiring decisions is to remove unconscious bias by design

Companies throw billions at the unconscious bias problem in the form of training.

Corporate diversity is a lucrative market, with Google alone spending $114 million on diversity programmes in 2014.

But unconscious bias training is a waste of time.

It just doesn’t work.

A meta-analysis of 426 studies found that although there was a reduction in bias immediately following the training sessions, the effects wore off after around 8 weeks.

Since we're mostly unaware of our biases, training alone can only go so far.

An unconscious bias training workshop might have some value, but if you have to run training sessions every few weeks to make an impact, they're not the most cost-effective or efficient means of removing bias.

Although awareness alone can’t fix the issue, we can remove bias from our decision making by designing processes and investing in recruitment tools that eliminate potential biases.

At Applied, we set out to do just this - fight unconscious bias in recruitment by rethinking the recruiting process itself.

We did this by building a blind hiring platform that assesses candidates anonymously, using only the most predictive forms of assessment (with not a single CV in sight). 

In the spirit of our pursuit for fairer hiring, we’ve laid out our process from start to finish in this (free) resource, so that you can set up an unconscious bias-free hiring process too (and no, you don’t need to use our platform to achieve this - although it’ll certainly make it a walk in the park).
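As a rough illustration of what ‘removing bias by design’ can look like in code (a simplified sketch under our own assumptions, not Applied’s actual implementation), a blind review step might strip identifying fields from each application and randomise the order in which the remaining answers are reviewed, so that names, addresses and ordering effects can’t trigger the shortcuts described above:

```python
import random

# Simplified sketch of blind, order-randomised review - not Applied's actual implementation.
IDENTIFYING_FIELDS = {"name", "email", "address", "photo", "university", "date_of_birth"}

def anonymise(application: dict) -> dict:
    """Drop fields that commonly trigger unconscious bias, keeping only the work-sample answers."""
    return {k: v for k, v in application.items() if k not in IDENTIFYING_FIELDS}

def review_order(applications: list, seed=None) -> list:
    """Shuffle anonymised applications so ordering effects (e.g. reviewer fatigue) aren't systematic."""
    anonymised = [anonymise(a) for a in applications]
    random.Random(seed).shuffle(anonymised)
    return anonymised

candidates = [
    {"name": "A. Candidate", "university": "Example University", "answer_1": "..."},
    {"name": "B. Candidate", "university": "Another University", "answer_1": "..."},
]

for application in review_order(candidates, seed=42):
    print(application)  # only the skills-based answers remain
```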


Eliminate unconscious bias by design with the Applied recruitment platform

Our data-driven recruitment software helps managers implement equitable hiring practices based purely on skills and merit. By removing unconscious bias from the hiring process, companies can create a culture of diversity and inclusion, whilst also improving their staff retention efforts.

Our software offers a wide range of skills-based questions and tests, with predictive assessments so only the best candidates are selected for interview. You can also use Applied to manage the entire interview process, from sending invites to scoring how candidates perform in interviews.

Not only do we ensure a simple and enriching candidate experience, but we provide live diversity and performance data on what channels are producing the best quality candidates. This will help you reach your diversity goals whilst also optimising your recruitment funnel.

Applied is the essential recruitment platform for fairer hiring. Purpose-built to make hiring ethical and predictive, our platform uses anonymised applications and skills-based assessments to improve diversity and identify the best talent.

Start transforming your hiring now: book in a demo or browse our ready-to-use, science-backed talent assessments.