England and Wales Court of Appeal finds use of facial recognition technology in breach of privacy rights, data protection and equality law

The Court of Appeal for England and Wales has ruled that the South Wales Police Force’s use of facial recognition technology to compare digital images of members of the public with images of persons on a police ‘watchlist’ was unlawful and in breach of privacy rights, data protection and equality law.

In 2015, the South Wales Police Force received a licence to pilot the use of automated facial recognition technology. Police deployed the technology at large public gatherings to identify individuals who were “wanted on suspicion for an offence, wanted on warrant, vulnerable persons and other persons where intelligence [was] required”. Although the technology was intended to match faces against the list of persons of interest, it collected biometric data indiscriminately from everyone it scanned.

The appellant, Edward Bridges, had been captured on CCTV twice: once while shopping in December 2017 and again while attending a protest in March 2018. Bridges contended that he had not seen any signs indicating that facial recognition technology was in use in the area at the time. Mr Bridges brought judicial review proceedings claiming that the use of the technology was incompatible with the right to respect for private life under Article 8 of the European Convention on Human Rights, and in breach of the Data Protection Acts 1998 and 2018 and the Public Sector Equality Duty. In September 2019, the High Court dismissed Mr Bridges’ claim, finding that any interference with his privacy rights was proportionate and in accordance with law.

While the Court of Appeal accepted that the use of facial recognition technology amounted to a proportionate interference with human rights, it found that the legal framework governing its use was insufficient to render the interference with privacy rights under Article 8 of the European Convention on Human Rights “in accordance with law”. The Court held that the framework was deficient in part because it did not address the processing of biometric data relating to people who were not on the ‘watchlist’, and because it gave too much discretion to those operating the system as to who should be placed on the watchlist.

The Court also found that the Data Protection Impact Assessment carried out by the police failed to properly assess the risks to the rights and freedoms of data subjects and failed to address the risks arising from the deficiencies in the legal framework.

The Court agreed with Bridges that the Public Sector Equality Duty had been breached because the police “never sought to satisfy themselves, either directly or by way of independent verification, that the software program in this case does not have an unacceptable bias on grounds of race or sex.” Although the Court did not find that the technology was in fact biased, the mere risk of indirect discrimination was sufficient to create a duty to investigate whether such bias might arise.

The Court granted declaratory relief to reflect its findings and confirmed that UK law enforcement agencies have specific obligations to meet in order to use automated facial recognition.  

Click here to read the full decision. 

Click here for a previous PILA Bulletin article on the case.
