Facial recognition glitch leads police to arrest Asian man for theft 100 miles away


Police arrested a man accused of burglary in a city he had never visited after face scanning software deployed across Britain confused him with another man of South Asian heritage.

Alvi Chaudhary, 26, a software engineer, was at home with his parents in Southampton in January when police knocked on his door, handcuffed him and detained him for about 10 hours before releasing him at 2 p.m.

Thames Valley Police used automated facial recognition software to match him to footage of a suspect in a £3,000 theft 100 miles away in Milton Keynes, according to documents shared with the Guardian by Liberty Investigates.

But the CCTV footage shows a young man whose features, apart from similar curly hair, differ from his own, Chaudhary said, leaving him baffled as to why he was arrested.

“I was very angry because the suspect looked about 10 years younger than me,” said Chaudhary, who has a beard. “Everything was different. The skin was lighter. The suspect looked to be 18 years old. His nose was bigger. He had no facial hair. His eyes were different. His lips were smaller than mine.

“I just assumed that the investigating officer saw that I was a brown man with curly hair and decided to arrest me.”

UK police forces use an algorithm supplied to the Home Office by Cognitec, a German company. They run approximately 25,000 searches a month against approximately 19 million police mugshots held on a UK-wide police national database. According to the National Police Chiefs’ Council, a facial match should be treated not as fact but as intelligence. Thames Valley Police said the decision to arrest Chaudhary was also based on a human visual assessment.

But research commissioned by the Home Office revealed in December that in some settings the technology produces far higher rates of false positives for black (5.5%) and Asian (4.0%) faces than for white faces (0.04%). Police and crime commissioners have cautioned about “built-in bias”, stating that although “there is no evidence of adverse effects in any individual case, this is more by luck than by design”.

Since December, Thames Valley Police has also been deploying live facial recognition technology to scan the public at locations in Oxford, Slough, Reading, Wycombe and Milton Keynes. It has captured nearly 100,000 faces, leading to six arrests.

Given the difference between the face of the man on the CCTV and his own, Chaudhary assumed he would soon be released. He produced evidence of work meetings in Southampton on the day of the crime but was taken into custody instead.

Chaudhary is claiming damages against Thames Valley Police and Hampshire Constabulary, which carried out his arrest. His neighbors saw him being taken away in handcuffs, his father was deeply distressed by the arrest, he said, and he was unable to work the next day. He is also demanding greater transparency about the number of wrongful arrests involving facial recognition technology.

Chaudhary blames the police system for his only previous mugshot: he was wrongly arrested in 2021 after being attacked at night at university in Portsmouth, and police released him without taking any action. Now that he has a second mugshot on file, he fears the automated system could produce more false arrests.

“In my mind, if a brown guy in Scotland robbed a bank, would they come and arrest me?” he said.

He sometimes requires security clearance to work for government clients and has been asked about the arrest. “It makes me look suspicious,” he said.

Thames Valley Police admitted to Chaudhary that the arrest “may have been the result of bias within facial recognition technology”. But an official told him that “since the use of facial recognition is already a subject of review at the strategic level, I do not feel the need to raise the issue as part of broader organizational learning”.

A spokesperson for Thames Valley Police described the arrest as lawful and said: “Although we apologize for the distress caused to the complainant in this case, his arrest was based on the investigating officers’ own visual assessment that the man matched the suspect in CCTV footage following a retrospective facial recognition match, and was not influenced by racial profiling.”

But Chaudhary said officers at the Hampshire police station laughed when he asked: “Does it look like me?” And he said that the Thames Valley Police officers who arrived to interview him, after watching the footage of the suspect and seeing his photograph, “knew I was not the suspect”.

There have been repeated warnings about the use of automated facial recognition technology. The UK’s Biometrics and Surveillance Cameras Commissioner, William Webster, in December 2024 echoed concerns that police continued to retain and use images of people who were arrested but never charged or arraigned. Last month, South Wales Police paid compensation to a black man who was wrongly arrested and detained for 13 hours after facial recognition technology misidentified him.

Chaudhary’s lawyer, Ian Gould, a partner at DPP Law, said police should “ensure that artificial intelligence is not substituted for human intelligence and due diligence, but is used in careful partnership with it”.

The Home Office said guidance and training are being reviewed by the Police Inspectorate to reduce error and maintain public confidence in retrospective facial recognition. It said a new national facial matching system was being developed with an improved, independently tested algorithm.
