Police admit AI surveillance panopticon still has issues with “some demographic groups”

Illustration by Tag Hartman-Simkins/Futurism. Source: Getty Images

UK officials on Thursday pledged to implement a nationwide facial recognition system to help police track down criminals. Ministers have launched a 10-week consultation to examine the regulatory and privacy framework for the AI-powered surveillance panopticon – but one way or another, the all-seeing eye is on its way.

There’s just one small drawback: AI facial recognition cameras have a tendency to misidentify non-white people.

New reporting by The Guardian notes that a test of the AI technology conducted by the National Physical Laboratory (NPL) found it is “more likely to inaccurately include certain demographic groups in search results” – particularly Black and Asian people.

Although national policing minister Sarah Jones has described facial recognition technology as “the biggest breakthrough in catching criminals since DNA matching,” lower-level police commissioners told The Guardian that the NPL’s findings “highlight the underlying bias involved.” They likewise urged caution on the national rollout – advice that appears to be falling on deaf ears.

According to the NPL analysis, the retrospective facial recognition tool – one of three types of facial recognition software used by UK police – has a “false positive identification rate for white subjects (0.04 percent)” that is “lower than that for Asian subjects (4.0 percent) and Black subjects (5.5 percent).”

“This means that in some circumstances Black and Asian people are more likely to be incorrectly matched than their white counterparts,” the Association of Police and Crime Commissioners said in a statement to The Guardian. “The language is technical, but behind the description it seems clear that the technology has been deployed in operational policing without adequate safeguards.”

What this means for a nationwide rollout remains to be seen. London is already one of the most heavily surveilled cities on Earth, with an estimated 1,552 cameras per square mile. In November, the Home Office offered funding to deploy facial recognition vans to seven additional metro police forces, joining police in London, South Wales, and Essex, which have been using the vans for some time.

Each of these vans is linked to police watchlists and has AI-powered facial recognition cameras installed on the roof.

As part of the 10-week consultation, the government will collect feedback from citizens on whether police using facial recognition systems should be able to cross-reference their watchlists with other databases, such as passport and driver’s license registries. Still, given that ministers have already promised to dramatically expand facial recognition technology, it’s unclear how much weight public opinion will carry in the rollout.

If everything goes according to the system’s boosters’ plans, a new national database will be set up containing “millions of images” of innocent civilians, according to The Guardian.

“The racial bias in these statistics reflects the harmful real-life effects of letting police use facial recognition without proper safeguards,” said Charlie Whelton, a policy and campaigns officer at the advocacy group Liberty. “The government must stop the rapid implementation of facial recognition technology until such safeguards are in place to protect each of us and prioritize our rights – something we know the public wants.”

More on surveillance: AI startup says it will eliminate crime by keeping the entire United States under the watch of always-on spy cameras.
