Long reads

Would you let AI recruit for you?

12 Dec 2017 By Robert Jeffery

How algorithms are rewriting the rules of who we hire (and what that means for your job...) 

A job at Google is meant to be the very apex of aspiration for the ambitious modern worker. But not, it seems, if you’re working in Google’s recruitment function. In an article in The Washington Post, a ‘talent channels specialist’ at the web giant recently painted a dystopian vision of life in its cavernous Californian HR department, where he spent his days filling in templated emails, asking questions from a spreadsheet and scheduling interviews he was too lowly to conduct.

“I scour LinkedIn, a factory farm of fluff, for engineers with a specific skillset and then send hundreds of canned messages to unsuspecting professionals each week,” he lamented, describing conversations with candidates “comparable in depth and variation to a drive-through order at Burger King”.

If his existence seems monotonous (and accepting that even boring jobs can be more fun if you focus on the positive), there’s good news on the horizon. Within a few years, such tedious recruitment jobs may be done by something more robotic than human.

Algorithms deploying sophisticated machine learning are reshaping recruitment faster than any other part of HR. That’s not surprising to technologists: recruitment (and not just in the Googleplex) is full of routinised actions, relies on analysing swathes of data and for decades has hinged on humans making the sort of judgements we’re heroically bad at.

It’s simplicity itself to make a case for machines being better than us at most of the recruitment cycle. They’re also, inevitably, better value for money, at least in the short term. Which leaves us with two competing scenarios: one in which recruiters work with AI as Moneyball-style experts, using data insight to supplement their intuition and excelling at relationship building. And another in which the recruitment department is – in the words of the provocative recruitment entrepreneur Sergei Sergienko – “filled up with computers” trawling the web, ranking applicants and conducting chatbot conversations with limited human involvement.

To decide which is the more likely future, you have to follow the money. And at the moment there’s a lot of it about. Consultancy CognitionX says it is tracking 300 HR-related tools that use machine learning; more than 100 of them are focused on recruitment. In the US, the HR assessment and recruitment market has been valued at $100bn and major players such as HireVue, which offers video analysis of candidate interviews, have scooped eight-figure funding. In the UK, the market is similarly excitable.

“There are a lot of early adopters, but we’re still relatively early in the process and HR generally isn’t that well advanced [in its use of AI],” says Ed Janvrin, head of research at CognitionX. “But it’s coming. A lot of HR departments are having those conversations now.”

He outlines a range of areas where activity is buoyant: candidate sourcing; compatibility matching (using psychometrics or other forms of selection); predictive analytics around new hire performance; full recruitment platforms focused on improving candidate experience; and video interviewing.

Within those categories there lie multiple possibilities. Algorithms can write job descriptions that eliminate any form of biased language and remove all trace of protected characteristics from applications. Services such as HireVue can minutely track facial expressions in video interviews to make judgements on suitability based on body language and tone of voice. Chatbots can take over frontline conversations with candidates. ‘Data mining algorithms’ can search social media postings for context that might support an application (often without an individual’s permission, which has led to legal action in the US). And if you’re writing your own emails or LinkedIn messages to prospective hires? You’re probably wasting your time when a multitude of apps can suggest wording that will reel them in.

Such is the level of scrutiny machine learning is bringing to some aspects of recruitment that at least one vendor will minutely analyse the way a candidate fills in an online application form – deducting marks for excessive use of the backspace key, which is taken to indicate indecision or evasiveness.

If it all sounds impossibly futuristic, in reality the technology is often just a more intelligent adjunct to the sort of applicant tracking systems we already use.

“There’s still a big gap between what can be done and what people are actually doing,” says Dr Tomas Chamorro-Premuzic, professor of business psychology at University College London and author of The Talent Delusion. “What happens in the field of workforce analytics at the moment is quite basic. Most workforce data is unstructured, and there are legal and ethical restrictions around its use.”

There are reasons to suspect the market will accelerate – principally, that technology’s biggest beasts are now showing an interest. Google’s Hire is a new cloud-based applicant tracking system that complements the Google for Jobs search engine. LinkedIn is integrating with the Outlook email platform and Microsoft has introduced a Resume Assistant to try to standardise CV data and job descriptions.

The opportunity for Silicon Valley is not just that recruitment is big business – £35bn per year in the UK alone, says the Recruitment & Employment Confederation (REC) – but that not even the most generous scientist would attempt to defend the hugely subjective way most of us hire.

“Most interviews are a waste of time because 99.4 per cent of the time is spent trying to confirm whatever impression the interviewer formed in the first 10 seconds,” Laszlo Bock, former Google HR chief, has said. That’s probably an underestimate. We construct narratives to justify and support our snap decisions, we value anecdotal recommendations above hard data and we consistently fail to scrutinise candidates’ claims.

“For around 50 years, we’ve known that structured or semi-structured interviews can work really well if they are performed by trained people and you have two or three independent raters who are blind to other information,” says Chamorro-Premuzic. “But we also know that people are incapable or unwilling to do that, so most interviews are still done in a really sloppy way.

“In the last five or 10 years, we have been rebranding HR as people analytics in an attempt to make it more evidence-based. But if you look at what most firms do around talent acquisition, they hold an interview and make a decision off the back of that. It’s not objective.”

AI cannot force us to admit our fallibility, but it can introduce some objectivity into proceedings. It does this more easily in some settings – particularly graduate and volume recruitment, and knowledge or tech work – than others. But to its adherents, it is a powerful tool. Unilever has been the Trojan Horse of AI recruitment, putting around 250,000 candidates globally through a system involving gamified psychometric testing followed by an analysed video interview and an algorithm-driven selection process. The company’s HR team is “overjoyed” with the results.

But there are, undoubtedly, issues to be ironed out. Most pressingly, there is no irrefutable proof that AI delivers better hires in the long term. As David D’Souza, the CIPD’s head of London, says: “It’s increasingly easy to measure speed of hire. But genuinely understanding that, of all the candidates out there, you’ve got the best person for the job in terms of fit and potential, is an impossible task.”

Unilever has impressive metrics – £1m in ROI in the first few months of its new recruitment systems; 80 per cent of AI-suggested candidates judged ‘good hires’; the ‘most diverse’ group of recruits it has ever seen – but others have found that some AI-driven systems are left wanting in terms of candidate experience or understanding of more niche roles.

AI could be a game-changer in unlocking ‘dark matter’ candidates who are perfect but passive (LinkedIn says fewer than 30 per cent of people are actively looking for a role at any one time). But as D’Souza points out, this is incongruous at a time when we’re trying to be blinder in our recruitment processes. He fears we will become pawns in a ‘beauty parade’, forced to make online profiles shine, just as a few years ago it was essential to load your CV with keywords so bots would find it.

This issue of information seepage will be brought to a head by the introduction next year of the General Data Protection Regulation (GDPR), which will require candidates to be explicitly told if they are subjected to an automated process. In itself, this may not be a game-changer, says Alison Woods, partner at CMS Cameron McKenna Nabarro. She points to the introduction of legislation concerning the use of cookies to track online activity – in reality, most people did not exercise their rights, just as Unilever says not a single candidate objected to its AI processes.

More concerning is the GDPR’s requirement for explicit consent before retaining and using candidate details. Woods is concerned about the “incessant collection of CVs by managers across the business, who then share them with others”, which could constitute misuse. “You need to explain the basis on which information is being collected and shared,” she warns. “While consent is generally the basis on which most HR data processing is undertaken, recruitment is often an exception.”

The GDPR may wake us up to AI’s stealthy creep into recruitment, just as the issue of embedded bias has already stirred some. By profiling existing high performers and applying their traits to candidates, algorithms have been found to replicate demographics too – if your top performers are young white men, AI will pick up on behaviours and keywords that lean towards this group. There is an assumption that, having been alerted to such biases, future algorithms can correct them – “where it’s been said that algorithms have eliminated or chosen against certain ethnic or social groups, that is only because the models have not been trained correctly”, says Chamorro-Premuzic – but that is arguably because humans in the process have been alert to the issue.
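The mechanism behind this kind of embedded bias is simple enough to sketch. In this minimal, hypothetical illustration (invented data, not any vendor’s actual system), a scorer “trained” on past top performers rewards whatever traits they happen to share – including traits that merely proxy for demographics:

```python
from collections import Counter

# Hypothetical historical top performers: most share the hobby "rugby",
# which in this toy example stands in for a demographic proxy.
past_top_performers = [
    {"hobby": "rugby", "degree": "CS"},
    {"hobby": "rugby", "degree": "CS"},
    {"hobby": "rugby", "degree": "maths"},
    {"hobby": "chess", "degree": "CS"},
]

# "Training": count how often each trait value appears among past hires.
trait_weights = Counter()
for person in past_top_performers:
    for trait, value in person.items():
        trait_weights[(trait, value)] += 1

def score(candidate):
    # Candidates are ranked by how closely they resemble past hires.
    return sum(trait_weights[(t, v)] for t, v in candidate.items())

alice = {"hobby": "netball", "degree": "CS"}   # equally qualified
bob = {"hobby": "rugby", "degree": "CS"}       # resembles past hires

# Bob outscores Alice purely because of the proxy trait.
print(score(bob) > score(alice))  # True
```

Nothing in the scorer mentions a protected characteristic; the skew arrives entirely through the training data – which is why retraining the model, as Chamorro-Premuzic suggests, still depends on humans noticing the proxy in the first place.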

Others believe AI can be a powerful way to improve diversity, if it is harnessed correctly. NBCUniversal has deployed an algorithm-powered diversity tool from WCN, which runs blind screening on applicants and predicts suitability for interview, before suggesting how a recruiter can target certain minorities more effectively if the organisation isn’t achieving its diversity targets. NBCUniversal had identified that BAME employees were underrepresented in both the UK and the US. There were also issues around women in technical roles.

Applying machine learning to better measure and quantify diversity among candidates has delivered a better applicant pool “and has also educated candidates on the opportunities that exist and their attainability to students we may not have reached in the past”, says Seldric Blocker, vice president of talent acquisition at NBCUniversal.

Blocker says algorithms are a powerful way to “predict outcomes” and improve time to hire, but human judgement must supplement AI, both to help secure talent and because humans react to a human face: “Candidates want to relate to people, and they won’t join the business without speaking to a person and understanding if they will definitely fit into the company culture.”

But the way to truly eliminate bias is not through AI, says Tom Hadley, the REC’s director of policy and professional services. He points to group interviews, more diverse panels and removing CVs from the process – which requires a level of oversight only humans can provide.

AI can always be viewed as a threat, but it can also constitute an opportunity if seen as a societal solution to inequity in recruitment. Here, many look to blockchain: the technology that enables direct, secure transactions between individual entities (and which spawned the bitcoin currency) could one day act as a verifiable source of candidate information. Eventually, our blockchain professional profiles might talk to potential recruiters and arrange the ideal job without our involvement. Bias – any form of subjectivity, in fact – would be a thing of the past.

Londoner Nicholas Shekerdemian is a convert to the ‘single source of truth’ that blockchain could offer, but admits that this is some way off. For now, he is focused on Headstart, the app he co-founded after dropping out of university and has watched grow from a handful of people to a staff of 17 in just a year, while also attracting enviable venture capital.

Headstart’s premise, like some of its rivals, is to augment human capability with data. As Shekerdemian says: “The best way to enhance recruitment is to give recruiters more data.” Organisations, he adds, “fundamentally don’t actually know how to measure how capable someone is”, which is why his software issues questionnaires to candidates and puts them through a psychometric assessment tailored to the requirements of the job. Time to hire, he says, is cut by 70 per cent and satisfaction with hires is up.

If Headstart gets enough traction – and enough cash – to build a sufficiently useful database of talent, it could become what its founder calls “LinkedIn on steroids”. But what Shekerdemian and others must ultimately rely on is recruiters realising that intuition is one of our most irrational traits.

Malcolm Gladwell memorably called intuition “feelings about facts”, which hints at its central dichotomy. Yet we have put it front and centre in recruitment. “If you are an expert, you can trust your intuition because it is already very data-driven,” says Chamorro-Premuzic. “But that is not the case for most people and… the best you can hope is to make them aware of their biases. Eradicating them is too much work for most people.”

Chamorro-Premuzic would, he says, be prepared to hire someone on the recommendation of an exceptional algorithm even if it went against his judgement on first meeting them. To most recruiters and hiring managers, however, the idea is ridiculous.

“I’ve spoken to many recruiters who believe they have an eye for talent,” says D’Souza. “At the end of the day, any of our intelligence and intuition can be boiled down to rules we have developed through experience [and can therefore be captured by an algorithm]. But we also have a high degree of confirmation bias – we forget our bad hires and emphasise the ones who worked out.”

Janvrin agrees that it is problematic to believe we will always know best, and says we will forever over-emphasise mistakes or biases that originate in algorithms as proof of our superiority. “We can take autonomous vehicles as a proxy for this debate,” he says. “Already, we see they are better than human drivers. The thing that stops them spreading is human attitudes to automation.”

And, he might add, human attitudes to being made redundant. For recruiters, the danger of automation is painfully clear, and Janvrin is among those sounding the alarm: “Over the next 10 years, more and more basic cognitive tasks will be automated. Initially, that will help recruiters have a more fulfilling time at work as they will do fewer repeatable tasks.

“But over time, you begin to need fewer recruiters. We should be concerned – the algorithms will get better at making good decisions, and I can’t help feeling that the human element will be squeezed out. It’s already happening in candidate sourcing, where hiring managers get a curated subset of applicants.”

There are other views, of course. D’Souza says there is an opportunity for recruitment to demonstrate its relevance by partnering with technology. It could also begin to apply the same degree of analysis and intelligence to the employees the company already has on its books.

Hadley says: “There will be an impact on back office roles, and we are seeing that already. Recruitment may need to evolve and think about where it can add value – perhaps by becoming genuinely consultative. But it used to be said that jobs boards were the end of the recruitment profession. Then it was LinkedIn. We have rethought our relevance in the face of those challenges, and we can do so again.” An entire industry is hoping he is right.

The CIPD Recruitment Conference on 20 February will focus on the pros and cons of technology for in-house teams – and how to balance the pressures of being strategic with the day-to-day practicalities of recruitment

