What is facial recognition technology and why is it controversial?

The High Court is set to rule on the first legal challenge over police use of facial recognition technology in the UK.

Human rights campaigners backed the landmark court action after several police forces in England and Wales trialled the controversial technology.

Ed Bridges, from Cardiff, was represented by civil rights group Liberty at a three-day hearing in May as he challenged the use of automatic facial recognition (AFR) by South Wales Police.

Ed Bridges

Here’s how facial recognition works – and why it is divisive.

– How does it work?

Technology trialled by the Metropolitan Police in London uses special cameras to scan the structure of faces in a crowd of people.

The system then creates a digital image and compares the result against a “watch list” made up of pictures of people who have been taken into police custody.

Police at Notting Hill Carnival in 2017

Not everybody on police watch lists is wanted for arrest – the lists can also include missing people and other persons of interest.

If a match is found, officers in the area of the cameras are alerted.
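The scan–compare–alert pipeline described above can be sketched in a few lines of code. This is an illustrative sketch only: the face embeddings below are made-up fixed vectors and the names are hypothetical, whereas real police systems derive embeddings from camera images using proprietary neural networks.

```python
import math

def cosine_similarity(a, b):
    """Similarity between two face embeddings (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# "Watch list": person ID -> reference face embedding (hypothetical values)
watch_list = {
    "missing-person-17": [0.9, 0.1, 0.3],
    "wanted-suspect-04": [0.2, 0.8, 0.5],
}

def check_face(embedding, threshold=0.95):
    """Compare one detected face against every watch-list entry.

    Returns the best-matching ID only if the similarity clears the
    threshold; otherwise no alert is raised."""
    best_id, best_score = None, 0.0
    for person_id, ref in watch_list.items():
        score = cosine_similarity(embedding, ref)
        if score > best_score:
            best_id, best_score = person_id, score
    if best_score >= threshold:
        return best_id, best_score   # officers nearby would be alerted
    return None, best_score

# A face very close to the first watch-list entry triggers a match
match, score = check_face([0.88, 0.12, 0.31])
print(match)
```

The threshold is the key operational knob: set it low and more wanted people are spotted but more passers-by are wrongly flagged; set it high and the reverse.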

– How long has it been used?

The Met has used the technology several times since 2016, according to its website, including at Notting Hill Carnival in 2016 and 2017, at Remembrance Day events in 2017, and at the Port of Hull docks, assisting Humberside Police, last year.

South Wales Police piloted its technology during the week of the 2017 Champions League final in Cardiff, making it the first UK force to use it at a large sporting event.

Police outside the National Stadium in Cardiff

Facial recognition has also been used on a number of privately owned UK sites, including in shopping centres, museums and conference centres, according to a recent investigation by civil liberties group Big Brother Watch.

In August, the Information Commissioner’s Office announced it was launching an investigation into the use of facial recognition cameras in the King’s Cross area of London following a Financial Times report about their use on the 67-acre King’s Cross Central site.

This week, the site’s developer, the King’s Cross Central Limited Partnership, said it had two facial recognition cameras in operation at King’s Boulevard, used between May 2016 and March 2018, and that data processed by the system during that time was “regularly deleted”.

– Why is it controversial?

Campaigners say facial recognition breaches citizens’ human rights.

Liberty says scanning and storing biometric data “as we go about our lives is a gross violation of privacy”.

#FacialRecognition is being used by police and private companies in public places. It’s use is discriminatory and authoritarian. We need to ban it.

Sign the petition now 👇👇👇https://t.co/qrRniBlpCL

— Liberty (@libertyhq) August 28, 2019

Big Brother Watch says “the notion of live facial recognition turning citizens into walking ID cards is chilling”.

Some campaigners claim the technology will deter people from expressing views in public or going to peaceful protests.

It is also claimed that facial recognition can be unreliable, and is least accurate when it attempts to identify black people and women.

Before the hearing in May, Liberty said freedom of information requests had shown that South Wales Police’s use of AFR “resulted in ‘true matches’ with less than 9% accuracy” in the first year.
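A true-match rate below 9% can arise even from a system that is fairly accurate on any single face, because almost everyone scanned in a crowd is not on the watch list, so even a small false-positive rate generates far more false alerts than true ones. The sketch below uses entirely hypothetical numbers, not South Wales Police figures, to illustrate this base-rate effect.

```python
# Hypothetical scenario: all figures below are illustrative assumptions.
crowd = 100_000              # faces scanned at an event
on_watch_list = 20           # watch-list individuals actually present
true_positive_rate = 0.90    # chance a watch-list face is correctly flagged
false_positive_rate = 0.005  # chance an innocent face is wrongly flagged

true_alerts = on_watch_list * true_positive_rate
false_alerts = (crowd - on_watch_list) * false_positive_rate

# Precision: what fraction of all alerts are genuine matches
precision = true_alerts / (true_alerts + false_alerts)
print(f"{precision:.1%}")
```

Under these assumed numbers, only around 3–4% of alerts would be true matches, despite the system catching 90% of watch-list faces it sees.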

– What do the police say?

The Met says its trials, which ended in July, aimed to discover whether the technology was an effective way to “deter and prevent crime and bring to justice wanted criminals”.


“We’re concerned that what we do conforms to the law, but also takes into account ethical concerns and respects human rights,” the force said.

In May, South Wales Police barrister Jeremy Johnson QC told the High Court that AFR “potentially has great utility for the prevention of crime, the apprehension of offenders and the protection of the public”.

He argued that there is “no difference in principle knowing you’re on CCTV and someone looking at it, or that is being done automatically in a millisecond”, adding that the images are deleted 24 hours later if there is no match.
