
Use of automation must play to human strengths and flaws, says Fry

12 Jun 2020 By Francis Churchill

Opening the final day of the virtual CIPD Festival of Work, leading mathematician Dr Hannah Fry warns against blindly following algorithms

Automation and AI technologies need to be developed with humans in mind, a leading mathematician told delegates at the CIPD Festival of Work.

Dr Hannah Fry, associate professor in the mathematics of cities at University College London, said that when using data and algorithms to build tools that affect real people, “you cannot just create something, put it on a shelf and decide whether you think it’s good or bad in isolation.

“You absolutely have to think about how what you’re creating will fit into the world around it. You have to think about how humans will use it, and you have to think about the longer-term implications of what you’re doing,” she said.

Opening the third and final day of the online conference, Fry said the debate was often framed as ‘humans versus machines’, as though one could simply be swapped for the other in any given role. But, she said, the real challenge was to create algorithms that supported human decision-making rather than replacing it.

One controversial area where this was already being done, she reported, was the US judicial system, where in some cases algorithms that analyse defendants’ answers to a questionnaire were used to help judges decide on sentencing. When asked, most people said they would rather have a human judge pass sentence, said Fry. “[But] it turns out that actually there is an incredible amount of luck and bias in the judicial system,” she said.

She pointed to studies showing different judges often give different responses to the same case; that the same judge will give a different verdict to the same case on a different day; and that judges don’t like to give the same response too many times in a row. “So if three bail cases before you have been successful, your chances of a successful hearing drop off a cliff,” said Fry. 

“My favourite is that judges tend to be a lot stricter in towns where the local sports team has lost recently, which I think demonstrates the kind of mess of luck and inconsistency that you’re really up against,” she said.

There was a place for algorithms to help improve consistency in situations like this, said Fry. But humans needed to overrule algorithms when they made mistakes, she added: “While the algorithms we have now are spectacularly good at a lot of things, they are not perfect. They don’t understand context, and they don’t understand nuance… And when you add to that the fact that humans are very inclined to take cognitive shortcuts whenever they can, that can end up being a bit of a recipe for disaster.”

Fry warned that when a process or a machine “gets dressed up as an algorithm or AI, it can end up taking on an air of authority that makes it hard to argue with”, giving the real-life example of a group of tourists unquestioningly following a sat nav and driving a car into the ocean.

“We are too trusting… we’re also selfish, we forget stuff, we like to take the easy way out. And when we think about automation, the way that we approach automation, it has to acknowledge that fact,” said Fry. “It has to acknowledge those human flaws.”

