Few topics in the recruitment sector have proved to be more controversial than the use of artificial intelligence (AI). On one hand, everywhere you turn there are endless statistics about how AI is infiltrating business processes. According to LinkedIn, an overwhelming majority of recruiters and hiring managers agree that AI accelerates their ability to source and screen candidates. What’s more, 70 per cent of recruiters believe their current process would be more effective if it were more data-driven and used AI.
However, on the other side of the fence, research commissioned by the Royal Society of Arts suggests candidates don’t agree, with 60 per cent of the public stating they’re opposed to the use of automated decision-making in recruitment. Strikingly, only 14 per cent of people are even aware that automated decision systems are already being used in hiring processes today.
Oren Etzioni, the chief executive officer of the Allen Institute for Artificial Intelligence, encapsulates the double-edged nature of the technology best: “AI is a tool. The choice about how it gets deployed is ours.”
So, how can recruiters and hiring managers navigate this complex problem? To start, they must ask themselves not what AI can do, but rather what it should do.
Should recruitment teams incorporate AI into their processes?
The sweet spot for the vast majority of AI products on the market today is automating commoditised tasks to create efficiencies. For example, AI can support recruiters in analysing and prioritising the thousands of cover letters, CVs and application forms they handle each week.
Recruiters should also explore chatbots like AllyO and Mya, which can give candidates immediate feedback on their applications, answer FAQs and even help to schedule early-stage interviews. The global cosmetics company L’Oréal, which fills about 15,000 new positions each year, is using Mya to manage the candidate relationship at the initial stages of its recruitment process.
Should AI be used to support more inclusive hiring practices?
In a word: potentially.
With diversity and inclusion an increasingly important business priority, organisations need to recognise that even their most well-intentioned recruiters are prone to implicit bias. One academic study revealed that minority applicants who ‘whitened’ their resumes were more than twice as likely to receive calls for interviews. When it comes to fairly assessing job candidates, humans clearly need some help.
Fusing neuroscience and AI has shown promise in counteracting human bias, particularly for organisations that receive a high volume of CVs. Last November, the London-based start-up Headstart raised US$7 million in seed funding for prototype AI recruiting technology that, it claims, can reduce bias in hiring by 20 per cent. The technology has already helped the global consulting firm Accenture to increase its hiring of women by 5 per cent and of BAME candidates by 2.5 per cent.
Nevertheless, companies must also be aware that the data used to train AI will reflect existing bias. For example, Amazon’s experimental recruiting algorithm ended up disadvantaging female applicants because it was trained on 10 years of CVs that reflected male dominance across the tech industry. Consequently, the project was scrapped, and the company stopped using AI in its hiring practices.
Should AI define what makes a ‘good hire’?
Well, what constitutes a good hire?
The answer to this question changes from company to company, and often from department to department. Sometimes technical skills matter more than management capability – and vice versa. Many leaders have hired candidates who look excellent on paper but fail to deliver on the job. Context matters enormously, and context itself keeps changing.
One exciting case study of AI’s potential here can be found in the processes of automaker Tesla. Candidates apply for jobs via LinkedIn and then play neuroscience-based games designed to reveal the cognitive and behavioural traits the company believes are important. It’s an interesting example of AI being used to narrow the field further and screen out candidates who don’t fit a company’s priorities or culture.
However, finding the right candidate for the right job needs more than AI-backed psychometrics. The fact is that AI isn’t intelligent enough to spot those specific ‘X-factor’ traits that an employee needs to succeed in a particular organisation, department or role. As Adina Sterling, a professor of organisational behaviour at Stanford, concluded in a recent interview with Recode: “Algorithms are good for economies of scale. They are not good for nuance.”
Should AI replace humans in the recruitment process?
In our recent HR Tech Trends 2020 report, we predict that AI will continue to excel at routine, repetitive tasks, but we also acknowledge that humans will continue to outperform AI in areas such as critical thinking, socio-emotional intelligence and unique problem-solving for a long time to come.
Organisations need to view AI systems as tools that can accelerate and inform their recruitment processes and create much-needed bandwidth for busy teams, but not as ‘silver bullets’ that are capable of replacing, or frankly even mimicking, the critical role of experienced recruitment professionals.
The best approach right now isn’t to go ‘all in’ on artificial intelligence, but rather to introduce AI tools into your recruitment processes gradually, combined with rigorous testing and optimisation. AI will transform recruitment, but it will take time, effort and, most importantly, human ingenuity.