AI in Recruiting, a Cheat Sheet for Recruiters

For too long, the playing field has been uneven for job seekers, particularly for women candidates and candidates of color. Thanks to new technology now available—AI in recruiting, and machine learning designed to help remove bias from the hiring process—that’s changing.

No matter how open-minded any of us may strive to be, we all have experiences and beliefs that shape our decision making. Researchers from the University of Washington and Yale studied this concept, known as unconscious bias or implicit bias, and found it to be so pervasive that it affects nearly everyone (90-95% of people). When it comes to hiring, this near-universal unconscious bias has far-reaching effects, with several studies indicating how candidates' gender and race affect their chances of being hired. One found that job applicants perceived to be white were 74% more likely to receive positive responses. Another study, conducted by a Princeton researcher, found that female orchestra performers were more likely to be hired when auditions were conducted blind.

What Is Machine Learning in Recruiting, Anyway?

For those who aren't exactly sure, you've come to the right place. A type of artificial intelligence, machine learning is a relatively new technology in the field of recruitment, one that's growing in popularity and has been adopted by major companies across industries. Perhaps that should come as no surprise: as Forbes reports, the average recruiter spends 13 hours per week selecting candidates for just one position, while AI can sort through millions of data points to quickly vet and home in on job-seeker profiles that are a strong fit for a given role.


7 Ways Recruiters Are Leveraging AI Recruiting Software

Despite the rapid adoption of machine learning (69% of those working in talent acquisition say AI has helped them hire stronger employees), it's relatively new territory that many in the field are still getting to know. For a quick overview, here are some of the top ways companies are using AI in recruiting:

  • Finding candidates via company databases, Facebook, LinkedIn, and other websites, and predicting how likely they are to leave their current jobs and accept a new offer.
  • Reviewing job descriptions for bias and updating them to be more inclusive.
  • Determining whom job listings are shown to and which descriptions they see, according to a study of hiring algorithms.
  • Removing human bias, identifying candidate matches, screening candidates, and nurturing candidate relationships, according to a LinkedIn survey of 9,000 recruiters and hiring managers, 67% of whom say AI saves them time overall.
  • Reviewing resumes: business economics researchers at the University of Pennsylvania's Wharton School, who have studied the hiring (dis)advantages conferred by race and gender, say that using artificial intelligence to screen applicants is a growing trend.
  • Removing names and photos from applications to minimize bias.
  • Conducting personality assessments at scale.
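The anonymization idea in the list above can be sketched in a few lines. This is a minimal, hypothetical illustration of "blind screening" using simple regular expressions; production tools typically rely on trained named-entity-recognition models rather than patterns like these, and the function name and patterns here are assumptions for illustration only.

```python
import re

def anonymize_application(text, candidate_name):
    """Redact the candidate's name and common contact details.

    Illustrative only: real blind-screening tools use NER models,
    not hand-written regexes.
    """
    redacted = text
    # Redact email addresses first, so name redaction can't split them up.
    redacted = re.sub(r"[\w.+-]+@[\w-]+\.[\w.]+", "[EMAIL]", redacted)
    # Redact phone-like runs of digits, spaces, and separators.
    redacted = re.sub(r"\+?\d[\d\s().-]{7,}\d", "[PHONE]", redacted)
    # Redact each part of the candidate's name, case-insensitively.
    for part in candidate_name.split():
        redacted = re.sub(re.escape(part), "[REDACTED]",
                          redacted, flags=re.IGNORECASE)
    return redacted

application = ("Jane Doe, jane.doe@example.com, +1 555 123 4567. "
               "Jane led a team of 5.")
print(anonymize_application(application, "Jane Doe"))
# → [REDACTED] [REDACTED], [EMAIL], [PHONE]. [REDACTED] led a team of 5.
```

Note the ordering: contact details are stripped before name parts, since redacting "jane" inside "jane.doe@example.com" would otherwise break the email pattern.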

How to Evaluate AI Recruiting Software

When AI recruiting software is designed well, it can help limit the effects of unconscious bias, and, as a result, minimize preferential treatment or discrimination based on gender and race. Unfortunately, however, artificial intelligence technologies—and not just those for recruitment purposes, but also those powering everything from Google Translate to tools that screen for hate speech online—can be built in ways that perpetuate the bias of the humans who create them.

When it comes to bias in recruiting tools, poorly built AI can operate in unintended ways and serve the opposite of its intended purpose: extending, rather than overcoming, bias in the hiring process.

When AI recruiting software works, however, it can help companies foster diversity and inclusion, by identifying candidates with a track record of high performance, rather than focusing on factors such as grades, education, gender, and race.

To ensure AI keeps making recruitment fairer for everyone, recruitment and data science teams need to work together. Another key is contextualizing recruitment data models with measures that show when candidates have overcome unfair disadvantages.

“Bias is very much a human issue, not technological – so let’s stop pointing the finger at AI and Machine Learning. To overcome bias, the developers and the humans behind those technologies need to ensure processes and behaviours are fair. We empower our team to step-up, speak-up, and be the changemakers our industry needs,” says Gareth Jones, Headstart CEO.

Learn More About Machine Learning in Recruiting

Removing implicit bias from the process benefits not only job seekers but companies as well. McKinsey & Company has found that businesses committed to diversity and inclusion are more successful, with financial returns above those of their less diverse peers.

Ready to take the next steps with AI in recruiting? Discover how Headstart’s contextual algorithm, our proprietary AI recruiting software, has helped Accenture, Smiths, and others recruit the next generation of talent, resulting in a 100% increase in female hiring and a 2.5-point increase in hires among applicants of color.

