Court finds some fault with UK police force’s use of facial recognition tech - TechCrunch

Civil rights campaigners in the UK have won a legal challenge to South Wales Police’s (SWP) use of facial recognition technology. The win on appeal is being hailed as a “world-first” victory in the fight against the use of an “oppressive surveillance tool”, as human rights group Liberty puts it.

However, the police force does not intend to appeal the ruling — and has said it remains committed to “careful” use of the tech.

The back story here is that SWP has been trialing automated facial recognition (AFR) technology since 2017, deploying a system known as AFR Locate on around 50 occasions between May 2017 and April 2019 at a variety of public events in Wales.

The force has used the technology in conjunction with watchlists of between 400 and 800 people — which included persons wanted on warrants; persons who had escaped from custody; persons suspected of having committed crimes; persons who may be in need of protection; vulnerable persons; persons of possible interest to it for intelligence purposes; and persons whose presence at a particular event causes particular concern, per a press summary issued by the appeals court.

A challenge to SWP’s use of AFR was brought by a Cardiff-based civil liberties campaigner, Edward Bridges, with support from Liberty. Bridges was in the vicinity of two deployments of AFR Locate — first on December 21, 2017, in Cardiff city centre, and again on March 27, 2018, at the Defence Procurement, Research, Technology and Exportability Exhibition taking place in the city. While he was not himself included on a force watchlist, he contends that, given his proximity to the cameras, his image was recorded by the system, even if it was deleted almost immediately afterwards.

The human rights implications of warrantless processing of sensitive personal data by the police are the core issue in the case. Another key consideration is the risk of bias that can flow from automating identity decisions.

Bridges initially brought a claim for judicial review on the basis that AFR was not compatible with the right to respect for private life under Article 8 of the European Convention on Human Rights, data protection legislation, and the Public Sector Equality Duty (“PSED”) under section 149 of the Equality Act 2010.

The divisional court dismissed his claim on all grounds last September. He then appealed on five grounds — and succeeded on three in today’s unanimous court of appeal decision.

The court judged that the legal framework and policies used by SWP did not provide clear guidance on where AFR Locate could be used and who could be put on a watchlist — finding too broad a discretion was afforded to police officers to meet the standard required by Article 8(2) of the European Convention on Human Rights.

It also found that an inadequate data protection impact assessment had been carried out: SWP had written the document on the basis that there was no infringement of Article 8, meaning the force had failed to comply with the UK’s Data Protection Act 2018.

The court also judged that the force was wrong to hold that it had complied with the PSED — because it had not taken reasonable steps to make enquiries about whether the AFR Locate software contained bias on racial or sex grounds. (Though the court noted there was no clear evidence the tool was so biased.)

Since Bridges brought the challenge, London’s Met Police has gone ahead and switched on operational use of facial recognition technology, flipping the switch at the start of this year. In its case, however, the system is operated by a private company (NEC).

At the time of the Met announcement, Liberty branded the move “dangerous, oppressive and completely unjustified”. In a press release today it suggests the Met deployment may be unlawful for reasons similar to those in the SWP case, citing a review the force carried out. Civil liberties campaigners, AI ethicists and privacy experts have all accused the Met of ignoring the findings of an independent report which concluded it had failed to consider human rights impacts.

Commenting on today’s appeals court ruling in a statement, Liberty lawyer Megan Goulding said: “This judgment is a major victory in the fight against discriminatory and oppressive facial recognition. The Court has agreed that this dystopian surveillance tool violates our rights and threatens our liberties. Facial recognition discriminates against people of colour, and it is absolutely right that the Court found that South Wales Police had failed in their duty to investigate and avoid discrimination.

“It is time for the Government to recognise the serious dangers of this intrusive technology. Facial recognition is a threat to our freedom — it needs to be banned.”

In another supporting statement, Bridges added: “I’m delighted that the Court has agreed that facial recognition clearly threatens our rights. This technology is an intrusive and discriminatory mass surveillance tool. For three years now South Wales Police has been using it against hundreds of thousands of us, without our consent and often without our knowledge. We should all be able to use our public spaces without being subjected to oppressive surveillance.”

However, it’s important to note that Bridges did not win his appeal on all grounds.

Notably, the court held that the earlier court had correctly conducted a weighing exercise to determine whether the police force’s use of AFR was a proportionate interference under human rights law, considering “the actual and anticipated benefits” of AFR Locate against the impact of the deployment on Bridges. It decided that the benefits were potentially great while the individual impact was minor, and so held that the use of AFR was proportionate under Article 8(2).

So the UK court does not appear to have closed the door on police use of facial recognition technology entirely.

Indeed, it’s signalled that individual rights impacts can be balanced against a ‘greater good’ potential benefit — so the ruling looks more like it’s defining how such intrusive technology can be used lawfully. (And it’s notable that SWP has said it’s “completely committed” to the “careful development and deployment” of AFR, per the BBC.)

The ruling does make it clear that any such deployments need to be more tightly bounded than the SWP application to comply with human rights law. But it has not said police use of facial recognition is inherently unlawful.

Forces also cannot ignore equality requirements by making use of such technology — there’s an obligation, per the ruling, to take steps to assess whether automated facial recognition carries a risk of bias.

Given the bias problems that have been identified with such systems, that may prove the bigger blocker to continued police use of this flavor of AI.
