The High Court, sitting in Cardiff, has found the use of automated facial recognition by police to scan crowds for persons of interest to be lawful.
Since April 2017, South Wales Police have trialled automated facial recognition (AFR) ahead of a possible national rollout. This case concerned the pilot of AFR Locate, which extracts facial biometric information from live CCTV feeds and compares it with the biometric data of people on a watchlist. The watchlist might include people wanted on warrants, people unlawfully at large, or people suspected of having committed crimes. AFR Locate has been deployed around 50 times, with cameras mounted on vehicles or poles. It is not a form of covert surveillance; the police therefore aim to inform members of the public when AFR Locate is being used.
The case was brought by Ed Bridges, a former Liberal Democrat councillor, who noticed the cameras both while on his lunch break and at a peaceful protest against the arms trade. Mr. Bridges believed the use of AFR technology to be in breach of the right to privacy, data protection law, and anti-discrimination law. He was represented by Liberty in what is the world’s first legal challenge to police use of the mass surveillance tool.
The High Court found that while AFR Locate did amount to an interference with privacy rights, its use was proportionate and lawful. The Court was satisfied that there was a clear and sufficient legal framework governing when and how the system may be used. The Court highlighted that each deployment of the technology was for a limited time and for specific, limited purposes, and noted that all personal data was immediately deleted unless there was a match with the watchlist. The Court was also of the view that the collection and processing of personal data complied with the requirements of the Data Protection Act 2018 as they apply to law enforcement authorities.
Finally, the Court dismissed the argument that the surveillance breached the Public Sector Equality Duty under section 149 of the Equality Act 2010. It found there was insufficient evidence that the technology produced indirectly discriminatory results, such as a higher proportion of false positive matches for faces that were female or belonged to an ethnic minority, and was therefore not persuaded that the police had failed to have due regard to that risk.
Click here for the decision in full.