Making recruitment more inclusive: What we learned from Securitas Sweden
2026-04-14
Fredrik Törn
AI and automation are transforming how companies recruit. In high-volume hiring especially, technology can help organizations process thousands of applications, structure interviews, and create more consistent decision-making. But introducing technology into recruitment also raises an important question: can AI help reduce bias?
A recent collaboration between Securitas, Hubert, and researchers from the Institute for Futures Studies (IFFS) explored exactly this question. The goal was to understand how a more structured, technology-supported recruitment process could impact fairness and inclusion in hiring. The results offer valuable lessons for Talent Acquisition teams everywhere.
Bias in recruitment: What research shows

Historically, gender discrimination has been a significant issue in hiring processes. However, research in Nordic countries shows that explicit gender discrimination has decreased markedly in recent decades.

That doesn’t mean bias has disappeared.

According to Moa Bursell, a researcher at the Institute for Futures Studies, bias in recruitment often appears in more subtle ways. These include:

  • Unconscious bias, where recruiters unintentionally favor certain candidates
  • Structural bias, where hiring criteria unintentionally favor one group over another
  • Process bias, where certain stages of recruitment introduce unequal outcomes

Because these biases are often invisible to decision-makers themselves, they can be difficult to detect in traditional recruitment processes.

This is where structured and data-driven approaches can help.

A high-volume recruitment challenge

Securitas recruits approximately 2,000 employees per year in Sweden, drawn from roughly 45,000 applications.

Most roles are operational security positions such as:

  • Security guards
  • Control room operators
  • Airport security officers
  • Protective security staff

These roles attract a large applicant pool, but the industry remains strongly male-dominated: only about 25% of applicants are women.

To manage these volumes while ensuring fairness, Securitas Sweden introduced structured digital interviews through Hubert. Candidates who met basic eligibility requirements were invited to complete an interview where they answered standardized questions about motivation, experience, and competencies.

The goal was simple: evaluate candidates based on relevant skills and potential, rather than assumptions from CVs.

When data revealed an unexpected pattern

Shortly after implementing the new process, Securitas’ recruitment team observed a pattern in the early interview scores.

Women were on average scoring slightly lower than men in the initial interview stage.

Because improving diversity and fairness in hiring has long been a priority for Securitas, the team immediately investigated the results.

Together with Hubert and independent researchers, the team analyzed the scoring criteria in detail. The issue turned out not to be the AI itself, but the evaluation criteria used in the process.

Several factors contributed to the imbalance:

  • Bonus points for previous security training
  • Extra points for many years of experience in the industry
  • Experience with night work

Because the security industry has historically been male-dominated, these criteria unintentionally favored candidates who already had industry experience – something Securitas quickly identified as an opportunity to refine the model.
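To make the mechanism concrete, here is a minimal, purely illustrative scoring sketch. The field names, weights, and numbers are invented for illustration and are not Securitas' or Hubert's actual model; the point is only that a bonus for prior industry experience produces a group-level gap even when underlying competency is equal.

```python
# Illustrative only: a toy interview-scoring model showing how bonus
# points for prior industry experience can shift group averages.
from statistics import mean

def score(candidate, industry_bonus=2.0):
    """Structured-interview competency score plus an experience bonus."""
    s = candidate["competency"]            # interview score on a 0-10 scale
    if candidate["prior_security_exp"]:
        s += industry_bonus                # the criterion under scrutiny
    return s

# Hypothetical pool: prior security experience is more common in group A,
# mirroring a historically male-dominated industry. Competency is equal.
candidates = [
    {"group": "A", "competency": 7.0, "prior_security_exp": True},
    {"group": "A", "competency": 6.5, "prior_security_exp": True},
    {"group": "B", "competency": 7.0, "prior_security_exp": False},
    {"group": "B", "competency": 6.5, "prior_security_exp": True},
]

def group_means(pool, bonus):
    """Average final score per group for a given bonus weight."""
    return {
        g: mean(score(c, bonus) for c in pool if c["group"] == g)
        for g in {"A", "B"}
    }

print(group_means(candidates, bonus=2.0))  # a gap appears despite equal competency
print(group_means(candidates, bonus=0.5))  # reducing the weight narrows the gap
```

Lowering the bonus weight (the adjustment Securitas made in practice) shrinks the gap without discarding experience as a signal entirely.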

Adjusting the process

The insights from the analysis led the Securitas team to refine the recruitment model and introduce several changes:

  • Reduced the weight of bonus points for previous industry experience
  • Expanded what counted as relevant experience, including roles in healthcare or customer service
  • Added a question about general work experience, capturing broader competencies
  • Reviewed question wording and candidate experience to make the process clearer and more welcoming

These changes shifted the focus from narrow industry background to skills, motivation, and potential.

The result? Within a short time, the slight difference in average score between genders leveled out.

Key lessons for Talent Acquisition teams

The project highlights several important insights for organizations adopting AI in recruitment.

1. Technology is not a “set and forget” solution
AI tools must be continuously monitored and evaluated.

2. Bias often comes from criteria, not algorithms
The biggest risks may lie in how hiring requirements are defined.

3. Collaboration matters
Close cooperation between recruiters, technology providers, and researchers helped identify and fix issues quickly.

4. Human responsibility remains essential
Even with AI support, recruiters remain accountable for hiring decisions and candidate communication.
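The monitoring practice behind lessons 1 and 2 can be sketched as a simple parity check on interview scores. The record fields and the 0.5-point alert threshold below are assumptions for illustration, not any vendor's API:

```python
# A minimal sketch of ongoing outcome monitoring: compare average
# interview scores across groups and flag gaps above a threshold.
from collections import defaultdict
from statistics import mean

def score_gap_alert(records, threshold=0.5):
    """Return (gap, alert): the largest group-mean difference and
    whether it exceeds the review threshold."""
    by_group = defaultdict(list)
    for r in records:
        by_group[r["group"]].append(r["score"])
    means = {g: mean(scores) for g, scores in by_group.items()}
    gap = max(means.values()) - min(means.values())
    return gap, gap > threshold

# Hypothetical scores, echoing the pattern the Securitas team spotted.
records = [
    {"group": "women", "score": 6.8},
    {"group": "women", "score": 7.0},
    {"group": "men", "score": 7.6},
    {"group": "men", "score": 7.8},
]

gap, alert = score_gap_alert(records)
print(f"gap={gap:.2f}, alert={alert}")
```

A check like this does not fix anything by itself, but run regularly it turns an invisible drift into a concrete trigger for the kind of criteria review described above.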

The future of fair recruitment

AI and automation have the potential to make recruitment more consistent, transparent, and scalable. But technology alone cannot guarantee fairness.

Organizations must combine structured processes, thoughtful criteria, and continuous evaluation.

As this project shows, when companies actively monitor outcomes and adapt their processes, technology can become a powerful tool for building a more inclusive workforce.
