
Facial recognition leaves Brits in ‘digital police line-ups’ without realising


The use of facial recognition in public has drawn criticism from many (Picture: Matthew Horwood/Getty)

You’re out shopping, minding your own business, when a security guard suddenly appears and declares you’re a criminal.

That is the exact scenario one woman found herself in after a mistake by facial recognition software in a Home Bargains store.

And as the technology is rolled out ever more widely, everyone is at risk of being part of a ‘digital police line-up’ without their knowledge, according to the head of a civil liberties charity.

Sara, who wants to remain anonymous, was approached by a store worker who told her: ‘You’re a thief, you need to leave the store.’ Her bag was searched before she was led out.

Speaking to the BBC, she said: ‘I was just crying and crying the entire journey home… I thought, “Oh, will my life be the same? I’m going to be looked at as a shoplifter when I’ve never stolen”.’

Sara had been mistakenly identified by facial recognition software called Facewatch, which is used in a number of UK chains besides Home Bargains, including Sports Direct, Budgens and Costcutter.

While Facewatch did write to Sara and acknowledge the error, it declined to comment to the BBC. Home Bargains also declined to comment.

The case is just one of a growing number in which innocent people are accused of crimes they did not commit.

Facial recognition technology was first used by the Metropolitan Police at the Notting Hill Carnival (Picture: Jack Taylor/Getty)

Since the Notting Hill Carnival in 2016, the Metropolitan Police has used live facial recognition (LFR) technology both at major events and as part of general surveillance on streets. 

In January, the force revealed it had arrested more than a dozen people after deploying LFR over two days in Croydon, including people who had failed to appear in court and a man wanted on a prison recall.

The BBC also joined the Met during an LFR deployment, during which six arrests were made, including two people who had breached the terms of their sexual-harm prevention orders.

Earlier this month, a post on X by Tower Hamlets Police alerted residents to use of the software in the borough.


‘We’ll be using Live Facial Recognition technology at key locations in Tower Hamlets today (15 May),’ it said.

‘This technology helps keep Londoners safe and will be used to find people who threaten or cause harm, those who are wanted, or have outstanding arrest warrants issued by the court.’

In total, the Met has made 192 arrests this year as a result of the technology.

A van used by the Metropolitan Police as part of its live facial recognition surveillance (Picture: Getty)

However, just as in the case of Facewatch, the Met’s software, provided by NEC, is not infallible either.

Shaun Thompson, who works for youth-advocacy group Streetfathers, was mistakenly identified when passing an LFR van near London Bridge in February.

‘I got a nudge on the shoulder, saying at that time I’m wanted,’ Shaun told the BBC.

He was held for 20 minutes and asked to give fingerprints, only being released after handing over a copy of his passport.

‘It felt intrusive… I was treated guilty until proven innocent,’ he said.

The BBC said it understands the mistake may have been down to a family resemblance.



How does live facial recognition work?

Facial recognition starts by detecting a face in a still image or video, picking out which pixels make up the face and which belong to the body, the background or something else.

It then maps the face, for example by measuring the distances between certain features, to create a ‘numerical expression’ representing an individual.

This can then be compared rapidly against large databases of faces that have already been mapped, to try to find a match.
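In software terms, that ‘numerical expression’ is usually a vector of numbers (an embedding), and matching means finding the closest stored vector. Below is a minimal sketch of that final comparison step, assuming the embeddings have already been produced by some face-mapping model; the watchlist entries, vector values and threshold are all made up for illustration:

```python
import numpy as np

# Hypothetical watchlist: identities mapped to pre-computed face embeddings.
# Real systems use vectors with hundreds of dimensions; 4 keeps the sketch readable.
WATCHLIST = {
    "person_a": np.array([0.11, 0.52, 0.33, 0.90]),
    "person_b": np.array([0.87, 0.21, 0.45, 0.05]),
}

# A match is declared when the distance between embeddings falls below
# a tuned threshold; the value here is illustrative only.
MATCH_THRESHOLD = 0.35

def find_match(probe: np.ndarray) -> str | None:
    """Return the watchlist identity closest to the probe face, if close enough."""
    best_name, best_dist = None, float("inf")
    for name, ref in WATCHLIST.items():
        dist = float(np.linalg.norm(probe - ref))  # Euclidean distance
        if dist < best_dist:
            best_name, best_dist = name, dist
    return best_name if best_dist < MATCH_THRESHOLD else None

# A probe embedding extracted from a live camera frame (again, made-up numbers).
probe = np.array([0.12, 0.50, 0.31, 0.88])
print(find_match(probe))  # -> "person_a": the vectors are close, below threshold
```

Real deployments use much higher-dimensional embeddings and carefully tuned thresholds; a threshold set too loosely is exactly what produces false matches like those described above.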

However, the cases of mistaken identity serve as a stark reminder that anyone passing a vehicle operating LFR will have their face scanned, most likely without their knowledge.


Silkie Carlo, director of Big Brother Watch, has been observing facial recognition deployments for years.

‘My experience [is that] most members of the public don’t really know what live facial recognition is,’ she told the BBC, adding that once scanned, people are effectively part of a digital police line-up.

‘If they trigger a match alert, then the police will come in, possibly detain them and question them and ask them to prove their innocence.’

While the Metropolitan Police says only around one in every 33,000 people surveilled by its cameras is misidentified, the picture changes once someone is actually flagged: one in 40 alerts this year has been a case of mistaken identity.
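Both figures can be true at once because the vast majority of people scanned are not on any watchlist, so even a tiny per-person error rate accounts for a sizeable share of the alerts actually raised. A rough back-of-the-envelope sketch, assuming a hypothetical one million people scanned (the scan count is invented; the two rates are those reported above):

```python
# Back-of-the-envelope check that the two reported rates are consistent.
people_scanned = 1_000_000          # hypothetical scan count
misid_rate = 1 / 33_000             # ~1 in 33,000 people scanned is misidentified
false_alert_share = 1 / 40          # ~1 in 40 alerts is a mistaken identity

false_alerts = people_scanned * misid_rate        # ~30 people wrongly flagged
total_alerts = false_alerts / false_alert_share   # ~1,212 alerts in total
true_alerts = total_alerts - false_alerts         # ~1,182 genuine matches

print(f"{false_alerts:.0f} false alerts out of {total_alerts:.0f} total")
# Almost everyone scanned triggers no alert at all, so a tiny per-person
# error rate can still make up a noticeable fraction of the alerts raised.
```

Under those assumptions, roughly 30 of about 1,200 alerts would be false matches, which is consistent with both reported rates.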

South Wales Police also uses the technology (Picture: Matthew Horwood/Getty)

Facial recognition software, as with all technology, is constantly evolving, and has taken decades to reach a point where UK authorities are confident enough in its abilities to use it in law enforcement.

Many, however, would argue it is still not accurate enough to be deployed in public, privacy concerns aside.

Historically, facial recognition has shown significant racial bias, and studies have also found problems identifying women. However, a study last year by the National Physical Laboratory (NPL) found that the algorithms used by UK forces did not discriminate based on gender, age or ethnicity.

Alongside the Met, South Wales Police uses facial recognition technology, including at major events; it will be deployed at a Take That concert next month.

The force publishes a clear list of upcoming deployments, allowing the public to see when and where their faces may be scanned. The Met does not appear to offer an easily accessible calendar of surveillance.

As yet, there are no rules dictating whether those using LFR should make people aware their faces are being scanned.


Michael Birtwhistle, head of research at the Ada Lovelace Institute research group, told the BBC that laws have not yet caught up with the technology.

‘I think it absolutely is a Wild West at the moment,’ he said. ‘That’s what creates this legal uncertainty as to whether current uses are unlawful or not.’

For now, the Metropolitan and South Wales Police remain the only forces regularly using the technology. But with regular positive reports on the number of arrests made, it is not hard to imagine its use increasing.

In China, facial recognition software has logged almost every citizen in the country and uses a vast network of cameras to catch residents committing even the smallest misdemeanours, from jaywalking to using too much toilet paper in a public restroom.

While these examples seem unlikely to appear in the UK, Big Brother Watch’s Ms Carlo warns the country should not be complacent about the unchecked spread of LFR.

‘Once the police can say this is okay, this is something that we can do routinely, why not put it into the fixed-camera networks?’ she said.

The debate is not one-sided, however. The use of facial recognition has support from some members of the public, who are willing to give up some of their own privacy in exchange for safer streets.

Balancing privacy and public safety is not a new issue for the police, but one that 21st-century technology makes considerably more complex.

