We have been living in the age of machine learning for some time now. However, the coronavirus pandemic is creating the conditions for an accelerated advance. Machine learning refers to technologies that seek to emulate how people learn by using algorithm-driven processes that work with data relevant to a specific problem.
Just like people, the technologies have to be ‘trained’ with data formed from examples and experience to allow decisions to be made leading to action. However, a key difference is the connection to artificial intelligence, robotics, automation and other technologies, which together are termed the fourth industrial revolution. There are many examples of how machine learning has provided new products and services that have improved workplace efficiency (via automated document analysis, for instance), allowed customers to benefit personally (the distribution of pharmaceuticals) and helped tackle crucial problems.
There are, of course, some dangers and problems associated with machine learning. Even before the Covid-19 outbreak, there were concerns about how it would affect the need for human skills and the shape of work. That is, humans could be displaced – a concern with a long history. Now, as we head towards a ‘new normal’, we can already see how work is being recast by automation and the virtualisation of organisations.
Against this rapidly evolving context, we were interested in the extent to which those professionals with responsibility for people development also took account of machine learning. To find out, in 2019 we interviewed seven senior learning and development professionals working in a range of multinational, financial and public sector organisations. We posed questions relating to their acceptance of machine learning and their role in working with such technologies, ending with the question: what is the future of the HR profession in a world where machines can learn?
First, we found there was a growing awareness of machine learning and its associated features, but little connection with their roles. There was a tendency to recognise that there could, or even ought to, be a connection, but the professionals we interviewed were not being proactive. They also knew there could be an impact on their roles – for good or ill. However, most felt protected by their experience and skillset against any encroachment.
Second, there was recognition of how machine learning could be of value in relation to mundane aspects of work, such as payroll and recruitment, and how chatbots could be used. However, they were convinced that humans and their skills and experience would continue to be needed.
A further issue, and possibly prompted by our conversations, was that as L&D professionals they were seldom involved in the development stages of machine learning projects – nor did they have the skills or experience to contribute. They were more likely to connect with technical project teams on the basis of what training needs might be identified as the project was implemented. They could see how machine learning had the potential to reshape the work and the skills required by their organisations. Further, they could see how their roles and expertise as facilitators of change for the acceptance of technology might be more prominent.
Importantly, there was recognition that there could be a chance to impact on machine learning projects earlier and to provide a source of critique and ethicality. As L&D professionals, they could provide part of the context for the development of projects, even if they could not write algorithms themselves.
In response, we suggest L&D practitioners make a concerted move towards the front end of projects. To do this, we advocate the need for their development as hybrid professionals working more closely with the experts who develop machine learning technologies. Hybrid knowledge of this kind is developed through interaction and acquired over time through regular contact and involvement. If L&D professionals can develop this knowledge, they will be able to raise issues of ethics and human values and find opportunities for their own development and for others.
In other words, as hybrid professionals, they would move from being bystanders to or victims of machine learning, to becoming active and human-centred co-participants.
Jeff Gold is professor of organisation learning at Leeds and York Business Schools, Dr Lynn Nichol is head of the management and finance department at the University of Worcester and principal lecturer in HR, and Dr Patricia Harrison is senior lecturer at Liverpool John Moores University