This is Big Brother… you have been detected near the tube stop!

Campaigners say police use of facial recognition is ‘Orwellian mass surveillance’

Friday, 27th October 2023 — By Frankie Lister-Fell

The cameras have previously been used outside Highbury and Islington underground station

POLICE are deploying an “Orwellian mass surveillance tool” in the form of live facial recognition cameras, campaigners have warned.

The cameras have been set up outside Highbury and Islington tube station and have also been used in Camden Town.

There are two types of facial recognition used by the police: retrospective recognition, where images of someone suspected of a crime are compared against custody image databases, and live facial recognition (LFR).

LFR operates from police vans that are deployed in a specific area. Signs are displayed at the location to say that facial recognition is taking place.

The cameras scan anyone who walks past. Every face is mapped and converted into a “biometric face print”, similar to fingerprints, that is checked against a police watchlist looking for matches – without consent from those walking by.
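
In broad terms, systems like this reduce each face to a list of numbers, an “embedding”, and declare a match when a scanned face’s numbers sit close enough to an entry on the watchlist. The sketch below is a minimal illustration of that idea only: the names, the 128-number embeddings and the 0.8 match threshold are all invented, not details of any real police system.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    # 1.0 means the two embeddings point in exactly the same direction.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def check_watchlist(face_print: np.ndarray, watchlist: dict, threshold: float = 0.8):
    # Compare a scanned face print against every watchlist entry and
    # return the best match that clears the threshold, or None.
    best_name, best_score = None, threshold
    for name, listed_print in watchlist.items():
        score = cosine_similarity(face_print, listed_print)
        if score >= best_score:
            best_name, best_score = name, score
    return best_name

# Toy data: real systems derive embeddings from a trained neural network.
rng = np.random.default_rng(seed=1)
watchlist = {"person_A": rng.normal(size=128), "person_B": rng.normal(size=128)}
passer_by = rng.normal(size=128)  # a face scanned outside the station
print(check_watchlist(passer_by, watchlist))  # almost certainly None for random vectors
```

In practice the embeddings come from a trained neural network, which is where the accuracy and bias concerns raised by campaigners enter.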

Madeleine Stone, senior advocacy officer at Big Brother Watch, told the Tribune: “It’s the same thing as having your fingerprint taken by the police in order to walk down the street.”

Sir Mark Rowley

The Met say the technology is used to “locate dangerous individuals”.

Last week, commissioner Sir Mark Rowley said the force’s new plan to use retrospective facial recognition to identify shoplifters “was pushing boundaries”. But human rights organisations have said the technology is discriminatory, inaccurate and a “big privacy concern”.

Ms Stone said: “Police targeting residents with this dangerous mass surveillance tool treats them like suspects, erodes public freedoms and wastes public money.

“Live facial recognition is not an efficient crime-fighting tool, with the police’s own statistics revealing that more than 8 out of 10 facial recognition matches have been inaccurate since its introduction.

“This is an Orwellian mass surveillance tool rarely seen outside of Russia and China and has absolutely no place in London.”

She said the technology suffered from algorithmic bias, with studies showing live facial recognition is “less accurate for women and people of colour”.

And certain communities are over-policed and more likely to end up on watchlists, meaning they are more likely to be flagged by the technology, rightly or wrongly.

Cllr Caroline Russell

Green councillor Caroline Russell, chair of the London Assembly’s police committee, said: “The lack of transparency about the make-up of watchlists and the purpose of deployments is unhelpful in the context of the Met trying to improve Londoners’ trust and confidence in policing.

“LFR is a really dangerous technology and it’s very difficult to see how it’s actually making a difference in terms of policing.”

A Met spokesperson said: “The Met has primarily focused the use of LFR on the most serious crimes: locating people wanted for violent offences, including knife and gun crime, or those with outstanding warrants who are proving hard to find.”

Human rights organisation Liberty has launched a petition calling on the Home Office to ban facial recognition technology.

Emmanuelle Andrews, Liberty policy and campaigns manager, said: “We cannot police and surveil our way out of social problems, and facial recognition is the wrong response. The safest thing to do for everyone is ban facial recognition technology.”
