George Harrison
spiked.com
Automated facial recognition (AFR) is the state’s latest, and most invasive, surveillance technology.
Since 2015, three police forces – South Wales, Metropolitan and Leicestershire – have made use of AFR in controversial live trials. Now, South Wales Police have been taken to court by office worker Ed Bridges, who started a crowdfunding campaign when he felt his privacy had been violated by AFR.
Bridges’ legal challenge has been backed by civil-rights organisation Liberty, which argues that the indiscriminate deployment of AFR is equivalent to taking DNA samples or fingerprints without consent. According to Liberty, there are no legal grounds for scanning thousands of innocent people in this way. It also claims the technology discriminates against black people, whose faces are disproportionately flagged by mistake, meaning they are more likely to be stopped by police unfairly.
In London, AFR has been put on hold while the Metropolitan Police carries out a review. The Met is also facing a legal challenge of its own from Big Brother Watch, another civil-liberties group. Director Silkie Carlo, a vocal critic of AFR since its inception, told spiked: ‘People are right to be concerned when they can see us moving towards a police state. The result of this technology is that the normal relationship between innocence and suspicion has been inverted.’
One camera, placed in a busy, inner-city location, can scan the faces of up to 18,000 pedestrians per minute, automatically logging the features of anyone unlucky enough to walk past. A computer immediately checks these faces against a database of wanted mugs and lets nearby officers know if there’s a match.
I have previously warned on spiked against the illiberal use of this technology, and the flaws inherent in AFR policing have since become even clearer. Around 50 deployments have taken place so far in Wales alone, including during the Champions League final in Cardiff in June 2017. On that occasion, the system flagged 2,470 people as potential matches – 92 per cent of them wrongly.
The trials do not exactly inspire confidence in the accuracy of this technology. But even if AFR worked perfectly, its use would still violate our right to privacy and turn us all into suspects. In previous live AFR trials, it was unclear what would happen to members of the public who refuse to be scanned. Well, now we know: anyone who doesn’t consent to being turned into a walking ID-card will be treated like a criminal.