Are your HR techniques posing a privacy threat to your workforce? And when does monitoring your workforce become data infringement? For the 71 per cent of companies seeking to incorporate people analytics tools into their people management practices, knowing the answers to these questions is critical. Because while people data can benefit both employer and employee, the issues that this kind of analysis raises around data privacy and infringement of rights cannot be ignored.
A world in which job interviews are conducted by algorithms, employee mood is monitored through emails, and performance is predicted by data is no longer the stuff of science fiction. Businesses are increasingly turning to innovative technologies to collect, process and interpret people data, with multiple benefits. For example, employers can use people analytics tools to analyse voices and assess the honesty of interview candidates, or call on data to easily match employees to roles based on their skills. The insights that artificial intelligence and data sets can provide to employers may afford the workforce a more supportive work environment, the best team of colleagues and increased job satisfaction.
The benefits of the technology to business should not be underestimated – yet the legal risks associated with people analytics too often are.
The data privacy landscape has been significantly reshaped by the EU General Data Protection Regulation (GDPR), which came into force in May 2018. The regulation prohibits the taking of any decision based ‘solely’ on automated processing, including profiling, which produces ‘legal effects’ concerning an individual or ‘similarly significantly’ affects them. Human involvement is therefore essential where businesses wish to use people analytics tools to make decisions that will significantly affect an individual, such as denying them an employment opportunity, changing their role or dismissing them.
GDPR also exposes employers to hefty fines if, when introducing any form of people analytics, they fail to carry out an assessment of the impact of the processing operations on the protection of their employees’ personal data.
Even outside the issues raised by GDPR, there are a number of other considerations. There are, for example, growing concerns that artificial intelligence algorithms used to analyse people data could reinforce discrimination. Indeed, such tools in action have demonstrated a susceptibility to gender bias. Employers must constantly monitor algorithms to ensure that they do not operate in a discriminatory way, or risk facing discrimination claims – for which there is no cap on awards in the UK.
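In practice, monitoring an algorithm for discriminatory outcomes often starts with a regular statistical audit of its selection rates across protected groups. One widely used heuristic is the ‘four-fifths rule’, under which a group’s selection rate below 80 per cent of the most favoured group’s rate is treated as a warning sign of adverse impact. The sketch below is illustrative only – the group names, figures and threshold are assumptions, not a substitute for legal advice:

```python
# Minimal sketch of a periodic fairness audit using the "four-fifths rule".
# All group names, counts and the 0.8 threshold are illustrative assumptions.

def selection_rates(outcomes):
    """outcomes: dict mapping group name -> (number selected, number considered)."""
    return {group: selected / total for group, (selected, total) in outcomes.items()}

def adverse_impact(outcomes, threshold=0.8):
    """Return groups whose selection rate is below `threshold` times the
    highest group's rate, mapped to their impact ratio."""
    rates = selection_rates(outcomes)
    best = max(rates.values())
    return {g: r / best for g, r in rates.items() if r / best < threshold}

# Example: group_b's rate (0.25) is only 62.5% of group_a's (0.40), so it is flagged.
flagged = adverse_impact({"group_a": (40, 100), "group_b": (25, 100)})
print(flagged)  # {'group_b': 0.625}
```

An audit like this flags only one narrow form of disparity; a flagged result calls for human review of the underlying model and data, not an automated fix.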
Businesses looking to implement these technologies should also look to trade union bodies for guidance on appropriate use, particularly where artificial intelligence is concerned. UNI Global Union has issued the ‘top 10 principles for ethical artificial intelligence’ which call for businesses looking to implement people analytics practices to hold thorough and transparent dialogues with workforces to allay fears, establish trust and ensure employees are aware of their rights.
The advent of people analytics brings with it many exciting opportunities. It is perfectly possible for businesses to harness this technology beneficially and without legal risk, so long as the right people are trained and supported to make ethical use of the insights produced. People analytics tools should never be used to replace HR or to make decisions on its behalf, but to facilitate decision-making by identifying wider trends.
The most important principle to remember is that the right mix of people and technology is essential when it comes to making ‘people’ decisions, particularly those which affect individuals. When it comes to HR, it should never simply be a case of ‘computer says no’.
Caroline Stroud is a partner and head of people and reward practice at Freshfields Bruckhaus Deringer