We’ve allowed artificial intelligence into our lives almost without realising it. Alexa (or one of her competitors) will tell us tomorrow’s weather from even the vaguest of commands. Netflix’s algorithms suggest what we might like to watch based on our viewing habits. Health apps crunch data and draw conclusions about our wellbeing as we go about our business.
At work, it’s a similar story. The CIPD’s April 2019 report, People and Machines: From Hype to Reality, found that – despite the technology being in its relative infancy – 32 per cent of UK organisations had already invested in AI and automation, with 22 per cent introducing software to perform cognitive tasks.
Many HR departments are already using text-based chatbots of the kind deployed by banks or retailers to handle customer service queries by mimicking natural conversations. They’re a neat replacement for elements of employee self-service and free up HR advisors for value-added work.
But the next frontier of AI may be more controversial: ‘chatbot coaches’ are conversational AI programmes that claim to perform, to varying extents, the role of a business coach – evaluating surveys and performance reports, suggesting training to progress an individual’s personal development, and offering more general advice.
The appeal of a chatbot coach for individuals and employers is not just cost (though this is a considerable factor); there is also evidence that some people prefer speaking to a machine, believing it to be less judgmental and the process more efficient.
Major players include IBM’s Watson Career Coach, which uses CV data and interactions to learn an employee’s preferences and suggest appropriate development resources and vacancies within the organisation, and start-ups such as Saberr’s Coachbot, which facilitates team meetings and keeps action logs to encourage interaction, feedback and development.
Butterfly.ai is a platform specifically designed to drive conversation between managers and remote workers using pulse surveys and feedback, while Pocket Confidant – rather than offering advice – guides individuals who are ‘stuck’ to process their thoughts and come up with a solution.
All chatbot coaches work in different ways, but their key interaction – as with their human equivalents – is in asking or answering questions. Some can tackle simple queries such as “Are there any other opportunities for me in the company?” by pointing towards suitable resources. Others pose open-ended questions about career development (“Tell me how your role is going”) and respond with helpful links or prompts to speak to line managers or HR. Team coaches, meanwhile, can get groups together virtually and ensure they are following up on previously defined goals, sharing information between participants.
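As a rough illustration – not based on any particular vendor’s implementation – the simplest pattern behind these interactions is keyword-based intent matching, which routes an employee’s query to a canned response or resource link. All intents, keywords and links below are hypothetical:

```python
# Minimal sketch of the intent-matching pattern behind simple chatbot
# coaches. Every intent, keyword and link here is invented for
# illustration; real products use far richer natural-language models.

INTENTS = {
    "vacancies": {
        "keywords": {"opportunities", "vacancy", "vacancies", "role", "move"},
        "response": "Here are the current internal vacancies: intranet/jobs",
    },
    "development": {
        "keywords": {"training", "develop", "course", "skills"},
        "response": "You may find these courses useful: intranet/learning",
    },
}

FALLBACK = "I'm not sure I can help with that - shall I connect you to HR?"


def reply(message: str) -> str:
    """Score each intent by keyword overlap and return the best match."""
    words = set(message.lower().split())
    best_intent, best_score = None, 0
    for name, intent in INTENTS.items():
        score = len(words & intent["keywords"])
        if score > best_score:
            best_intent, best_score = name, score
    return INTENTS[best_intent]["response"] if best_intent else FALLBACK
```

A production system would replace the keyword overlap with a trained intent classifier, but the routing logic – match the query to an intent, answer or escalate to a human – is broadly the same.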
Whether they’re created by a global IT conglomerate or a Silicon Roundabout entrepreneur, one major attraction of chatbot coaches is their on-demand nature, with users able to simply pick up their smartphone whenever they like, rather than schedule and give up time for regular face-to-face meetings with a coach.
In a study presented at the 2017 International Conference on Internet Science, 42 per cent of chatbot coach users reported the main benefits were ease of use, speed and convenience. “In L&D, there’s an increasing move to content that’s used by people when it’s convenient for them,” adds Kevin Lyons, senior HR manager at Pearson. “We all have busy lives and busy schedules. Chatbots have a role to play in that.”
Convenience was the main driver behind Pearson’s recent pilot of one chatbot coaching platform, says Lyons, who wanted to move away from having a team of people providing purely classroom-based sessions. However, the trial wasn’t successful, and Lyons has gone back to the drawing board. “The chatbot wasn’t able to understand the responses I’d given, or suggest much to me beyond generic links and content,” he says, adding that in order to achieve the level of intelligence required, the technology needed a great deal more time and investment.
Other businesses are more optimistic. Dyson, Estée Lauder and Intel have used coaching chatbots, and they are being trialled across parts of the NHS. And the rise of chatbots in other areas – for example, offering cognitive behavioural therapy (CBT) for people experiencing mental ill-health – suggests the technology can be a preferable alternative for people who may feel uncomfortable speaking to a human.
A 2017 systematic review by academics at the Universities of California and Sydney of evidence that web-based chat software is effective in treating mental illness found “positive post-intervention gains”. Similarly, a study of one chatbot’s effectiveness in providing therapy to young people with symptoms of anxiety and depression, published in the journal JMIR Mental Health, found it a “feasible, engaging, and effective way to deliver CBT”.
But not everyone is convinced that AI is a viable alternative to working with a human coach in the more nuanced business environment. “Chatbots have a role to play, but they can never do exactly what a coach does because they’re not a person,” says Dr Rebecca Jones, associate professor in coaching and behaviour change at Henley Business School. “It’s an effective method of using goal setting to change your behaviour, but that on its own isn’t coaching.”
Jones also highlights that just because someone may feel more comfortable conversing with a chatbot rather than a human coach doesn’t mean it will necessarily help them more. “Working with a coach can be uncomfortable,” she says. “It can be confronting and you have to challenge things you don’t really like about yourself – but that’s where the biggest shift can occur.”
Professor David Clutterbuck, author, consultant and special ambassador to the European Mentoring and Coaching Council, agrees that chatbot coaches will never quite match up to the skills and expertise of a human. “Effective coaches bring wisdom based on reflection and experience – something coachbots are unlikely ever to achieve,” he says.
But AI’s potential to match the capabilities of a human coach is far from the only concern. “A particularly hot issue around these sorts of technologies is the security, privacy and robustness of the system,” says Rob McCargow, director of AI at PwC. “People could be sharing their innermost thoughts and secrets with this technology, so how is the data stored and secured? Who’s got access to it? All that is critical to get right.”
As well as security, McCargow highlights the importance of the “explainability” of AI systems. With some bots already appearing to have high levels of autonomy – Amazon’s algorithm can reportedly sack warehouse workers who continually fail to meet productivity targets – being able to understand the reasoning behind a system’s decisions is critical. “With the most powerful forms of AI, even the people who build them can’t necessarily explain how a decision is made,” says McCargow. “You’d want to be able to account for that.”
Despite the legitimate concerns around AI’s ethics, experts maintain the technology is still in its very early stages – which means we have an opportunity to get it right before it develops further. Clutterbuck highlights how coaching chatbots have only made small inroads into businesses so far, largely because the algorithms lack the ability to cope with “non-routine situations”. Making sure systems are fit for purpose in the early days will ensure strong adoption by the workforce in future, adds McCargow. “If we see any poor practice being applied to this, the workforce’s trust in these systems could be set back.”
But will the technology ever take over completely from human coaches? Martin Kirke, executive coach and former Post Office group HR director, thinks it may be possible – at least when it comes to coaching individuals. “It’s all a timescale, but at the end point I think they will be capable,” he says. “But less so for group sessions, because analysing the dynamic between people brings another level of complexity.”
Jones is less optimistic, believing an algorithm can never replicate the coach/coachee relationship. “If you have a strong relationship with your coach, you trust them and are more likely to be open and honest,” she says. “You’re never going to have that relationship with a computer, so you’ll always be limited.”
The next logical step, Clutterbuck believes, will be for coachbots to become more capable, but automatically defer to humans when they reach an impasse. Similarly, Kirke expects a move towards “augmented” intelligence in coaching, with tech and humans working in harmony and helping improve the quality of the service a person is able to provide.
In the meantime, the key for businesses keen to experiment with a chatbot is to ensure the HR department has a voice in the discussion and is fully involved in implementing the new technology, rather than remaining a passive bystander.
“All of this plays to a different model of HR,” adds Lyons. “Part of that model is getting more engaged in technology. It’s only going to replace you if you want it to.”