Ageism in hiring: The bias we don’t talk about
2026-02-20
Fredrik Törn
At some point in many professionals’ careers, age starts working against them. Too young. Too inexperienced. Too old. Too expensive. We’ve all heard the coded language. And the uncomfortable truth is this: age bias is one of the most widespread forms of discrimination in recruitment, yet one of the least addressed. That creates a real dilemma for hiring teams.

On one hand, recruiters and TA leaders genuinely want to make fair, objective decisions. On the other, they operate under pressure with high volumes, limited time, and increasing expectations to move faster. In that environment, shortcuts happen. Assumptions creep in, and bias, often unconscious, influences decisions.

Bias has always been part of recruitment

Let’s start with another uncomfortable truth: recruitment is deeply affected by human bias.

Numerous meta-studies on age discrimination in hiring show the same pattern (see below for a selection). Candidates above a certain age face significantly higher barriers for the exact same roles, often despite stronger experience. And this happens even though age discrimination is illegal in most Western countries.

Most of the time, the bias isn’t explicit. It’s latent.

Recruiters don’t wake up thinking, “I’m going to discriminate today.” But “gut feeling” is often just a collection of unconscious stereotypes.

From a quality standpoint, this isn’t just unfair; it’s also inefficient. If a decision is influenced by age, name, or background instead of competency, the shortlist becomes weaker.

How Responsible AI can reduce age bias

Here’s something important: AI, when built responsibly, creates a real opportunity to combat bias, and data-driven systems are inherently better at uncovering bias than purely human-led processes.

Why? Because a machine’s decision-making logic can be observed, measured, and audited.

In a human process, the real inputs are often invisible:
  • The tone of a candidate’s voice.
  • A shared hobby.
  • A recruiter’s mood that day.
  • An assumption about “energy” or “cultural fit.”

Those variables are never recorded. With structured, responsible AI, every evaluation is based on predefined criteria. Every variable can be reviewed. Every outcome can be compared.

That transparency creates accountability.
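To make that concrete, here is a minimal sketch of what “predefined criteria” can look like in practice. It’s written in Python with hypothetical criterion names and weights (nothing here is an actual scoring model); the point is that the score depends only on declared inputs, and every one of them is logged so it can be reviewed later.

```python
from dataclasses import dataclass, asdict
import json

# Hypothetical, predefined criteria and weights for one role (illustration only).
CRITERIA = {
    "sql_proficiency": 0.4,
    "stakeholder_communication": 0.3,
    "domain_experience": 0.3,
}

@dataclass
class Evaluation:
    candidate_id: str
    scores: dict   # criterion -> score between 0 and 1 from a structured interview
    total: float

def evaluate(candidate_id: str, scores: dict) -> Evaluation:
    # The weighted total depends only on the declared criteria; age, name, or photo
    # are simply never part of the input.
    total = sum(CRITERIA[c] * scores[c] for c in CRITERIA)
    evaluation = Evaluation(candidate_id, scores, round(total, 3))
    # Log every variable that contributed to the decision, so the outcome can be audited.
    print(json.dumps(asdict(evaluation)))
    return evaluation

evaluate("cand-001", {"sql_proficiency": 0.8, "stakeholder_communication": 0.7, "domain_experience": 0.6})
```

Every record produced this way can be compared across candidates and across time, which is exactly the kind of audit trail a purely human-led process never leaves behind.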

When AI is built around structure, well-proven methodologies, and consistency, it reduces the space where unconscious bias thrives. The process remains under human oversight, but machines remove the worst excesses of poor human judgment.

And that’s the desired state: Recruiters can feel confident that hiring recommendations don’t reflect hidden biases that expose them to legal or reputational risk.

Candidates can feel confident that assessments are monitored for equity, and that no demographic group is unfairly disadvantaged.

That’s not just good ethics.

It’s good hiring.
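On the monitoring point above: checking that no group is unfairly disadvantaged can start with something as simple as comparing pass-through rates per group at each stage. The sketch below uses made-up outcome data and age bands; the “four-fifths” ratio is a widely used heuristic from US selection guidelines for flagging a stage that deserves a closer look.

```python
from collections import defaultdict

# Made-up outcome records from one hiring stage: (age_band, advanced_to_next_stage).
outcomes = [
    ("under_40", True), ("under_40", True), ("under_40", False), ("under_40", True),
    ("40_plus", True), ("40_plus", False), ("40_plus", False), ("40_plus", True),
]

# Tally advancement per age band.
counts = defaultdict(lambda: [0, 0])  # band -> [advanced, total]
for band, advanced in outcomes:
    counts[band][0] += int(advanced)
    counts[band][1] += 1

# Pass-through rate per band.
rates = {band: adv / total for band, (adv, total) in counts.items()}
print("pass-through rates:", rates)

# Adverse impact ratio ("four-fifths rule"): lowest rate divided by highest rate.
# A value below 0.8 is a common trigger for reviewing the stage more closely.
impact_ratio = min(rates.values()) / max(rates.values())
print("adverse impact ratio:", round(impact_ratio, 2))
```

A low ratio doesn’t prove discrimination on its own, but it tells you exactly which stage to examine, which is far more than a “gut feeling” process can offer.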

Want to learn more?

That’s why we’re hosting a live panel on Ageism, Accountability, and the Future of Fair Hiring, together with Simon Bucknell.

We’ll discuss what ageism really looks like in today’s hiring processes, why it’s still overlooked, and how AI can be used responsibly to create fairer outcomes.

If you care about building hiring processes that are not just efficient, but equitable, I encourage you to join us here.

Want to read more?

For further reading on ageism in hiring, these are a good starting point. There is plenty more on the topic, as you will see, which I guess is a sorry testimony to how huge this problem still is. I wish I could say “happy reading”, but at least it is enlightening. And the Neumark study (no. 3) really resonates with our stance on using machines for the early stages of assessment to minimize the risk of ageism.

Two meta-studies:

1. Batinovic, L., Howe, M., Sinclair, S., and Carlsson, R., 'Ageism in Hiring: A Systematic Review and Meta-analysis of Age Discrimination', Collabra: Psychology, 9/1 (2023), 82194. Available at: https://online.ucpress.edu/collabra/article/9/1/82194/197046.

2. Harris, K., Krygsman, S., Waschenko, J., and Rudman, D. L., 'Ageism and the older worker: A scoping review', The Gerontologist, 58/2 (2018), e1–e12. Available at: https://academic.oup.com/gerontologist/article-abstract/58/2/e1/2894393.

One recent US study advocating “Age-Blind” steps in the hiring process to combat age discrimination:

3. Neumark, D., 'Age Discrimination in Hiring: Evidence from Age-Blind versus Non-Age-Blind Hiring Procedures', Journal of Human Resources, 59/1 (2024), 1–34. Available at: https://jhr.uwpress.org/content/59/1/1. Or as a working paper here: https://www.nber.org/system/files/working_papers/w26623/w26623.pdf

And here is a Swedish study where researchers sent out 6,000 (!) fictitious resumes with randomly assigned ages (35–70 years) for vacancies in low- and medium-skilled occupations. As you might have guessed by now, callback rates were lower for applicants aged 40+, especially for women.

4. Carlsson, M., and Eriksson, S., 'Age discrimination in hiring decisions: Evidence from a field experiment in the labor market', Labour Economics, 59 (2019), 173–183. Available at: https://www.sciencedirect.com/science/article/abs/pii/S0927537119300259.
