Big Brother is watching you: Facial recognition in the UK

In a recent Privacy Laws & Business article, data protection partner Katie Hewson, associate Daniel Jones and senior paralegal Nelson Kiu report on the latest, sometimes controversial, developments in live facial recognition.

In an era marked by rapid technological advancements, the integration of live facial recognition (LFR) technology into public surveillance systems has sparked considerable debate. This is especially so in the UK, where LFR has recently made headlines and emerged as a contentious issue at the intersection of privacy rights, civil liberties, and public security concerns. 

What is LFR?

LFR is a technology that enables the real-time, automated identification or verification of individuals based on their biometric facial characteristics. Using artificial intelligence to compare captured facial images against existing databases of known faces, LFR enables the swift identification of individuals. At the same time, LFR can automatically and indiscriminately capture the biometric data of anyone who passes within a camera’s range.
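To make the matching step concrete, the sketch below shows how a captured face, once reduced to a numerical embedding, might be compared against a watchlist. It is a simplified illustration rather than any vendor’s actual pipeline: the `match_face` helper, the 0.6 threshold and the watchlist entries are all hypothetical, and a real system would derive embeddings from a trained neural network rather than random vectors.

```python
import numpy as np

# Hypothetical watchlist: one pre-computed face embedding (feature vector)
# per known individual. Real systems derive these with a trained model;
# random vectors stand in for them here, purely for illustration.
rng = np.random.default_rng(seed=0)
watchlist = {name: rng.normal(size=128) for name in ("person_a", "person_b")}

def cosine_similarity(a, b):
    """Similarity between two embeddings, in the range [-1, 1]."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def match_face(probe, threshold=0.6):
    """Return the best watchlist match above the threshold, else None.

    The threshold trades false positives against false negatives;
    0.6 is an illustrative value, not a recommendation.
    """
    best_name, best_score = None, threshold
    for name, embedding in watchlist.items():
        score = cosine_similarity(probe, embedding)
        if score > best_score:
            best_name, best_score = name, score
    return best_name

# A face captured by the camera would be converted to an embedding by
# the same model before comparison; a random vector stands in here.
probe_embedding = rng.normal(size=128)
print(match_face(probe_embedding))  # most likely None for random vectors
```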

Because it identifies and matches individuals in real time, LFR is increasingly employed for a range of surveillance purposes, for example to deter unwanted behaviour in public places. LFR could even be used for marketing, targeted advertising or other commercial purposes, using individuals’ biometric data to place them in particular categories. Most notably, LFR has been deployed by law enforcement agencies for purposes such as suspect identification, surveillance at public events, and monitoring high-risk areas for criminal activity.

Regulating LFR in the UK

In January 2024, the UK House of Lords Justice and Home Affairs Committee (“Committee”) released a report1 advocating for the regulation of LFR and its use by the police. The Committee highlighted that the use of LFR by police forces in the UK lacks a clear legal foundation, and that there are no rigorous regulatory standards or consistent training approaches.

In a corresponding letter2 to the UK Home Secretary, the Committee called for the establishment of a legislative framework to regulate LFR and emphasised the importance of transparent deployment and independent scrutiny. In particular, police forces should communicate to the public when, where and how LFR is being deployed. The Committee also called for a thorough analysis of the lawfulness, necessity and proportionality of LFR and its impact on public privacy rights. A key part of the Committee’s concerns stemmed from the King’s Coronation in May 2023, where LFR was deployed on a scale described by academics as “probably the largest ever seen in Europe”. This signalled the UK government’s support for expanding the use of LFR to deter and detect crime in crowded public settings.

The new Criminal Justice Bill

In 2023, concerns were raised about the new Criminal Justice Bill (currently at the report stage in the House of Commons), which would allow police authorities to deploy LFR to run searches against the images of driving licence holders in connection with criminal investigations. Critics contend that the Bill would grant police authorities unlimited access to millions of biometric records for identifying suspects on public safety grounds, without adequate safeguards.

Big Brother Watch, a privacy rights group, has criticised the Bill, citing LFR’s high rate of inaccuracy. For instance, deployments of the Met and South Wales Police’s facial recognition systems were found to have inaccuracy rates of over 86%. Such inaccuracies could result in wrongful arrests, invasions of privacy, the expansion of surveillance, and even the curtailing of democratic rights such as the freedom to protest peacefully.
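For context, headline figures of this kind are typically derived as the share of a system’s alerts that turn out to be false matches. The short sketch below illustrates the arithmetic with invented numbers; it is not based on the actual counts from either force’s deployments.

```python
# Illustrative only: how an "inaccuracy rate" for an LFR deployment is
# commonly calculated. The counts below are invented for the example.
total_alerts = 100        # faces the system flagged as watchlist matches
confirmed_matches = 14    # alerts later verified as correct identifications
false_alerts = total_alerts - confirmed_matches

inaccuracy_rate = false_alerts / total_alerts
print(f"Inaccuracy rate: {inaccuracy_rate:.0%}")  # prints "Inaccuracy rate: 86%"
```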

Challenges with LFR regulation

In November 2023, the former UK Biometrics and Surveillance Camera Commissioner, the watchdog responsible for monitoring facial recognition in the UK, joined Facewatch, a facial recognition technology firm, as a non-executive director. Facewatch uses biometric cameras to screen faces against a watchlist, a system that has been introduced in many high-street shops and supermarkets in the UK in recent years.

This news surfaced amid the regulator’s challenges in effectively overseeing LFR. While underscoring the growing integration of LFR in the commercial and law enforcement sectors, the move raised concerns about the independence and accountability of oversight. Critics warned that the “revolving door” between regulatory bodies and technology companies could undermine efforts to hold stakeholders accountable for compliance. Furthermore, the current Data Protection and Digital Information Bill proposes to abolish the office of the Biometrics and Surveillance Camera Commissioner and transfer its responsibilities to the Investigatory Powers Commissioner, who is responsible for reviewing the use of investigatory powers by public authorities such as intelligence agencies and police forces.

LFR in society

King’s Cross surveillance: In August 2019, controversy erupted over the extensive facial recognition surveillance deployed at King’s Cross, London. Reports revealed that the private property developer Argent had installed CCTV systems with facial recognition technology in the area. The incident prompted an investigation by the UK Information Commissioner’s Office (ICO), which said it was “deeply concerned about the growing use of facial recognition technology in public spaces”. Although Argent maintained that the surveillance was carried out in the interest of public safety, questions were raised about the legality and ethics of conducting mass surveillance.

Beyoncé’s concert and police LFR: The police in the UK have also used LFR at gigs and sporting events, such as South Wales Police’s use of LFR at football matches, which was later subject to judicial review and an ICO investigation (PL&B UK Report, November 2019, pp. 9-10). The ICO’s view was that data protection law applies to LFR, and that a code setting out the relevant legal standards should be considered to regulate its use.

In May 2023, concerns arose when it was revealed that LFR had been deployed at Beyoncé’s concert in Cardiff to scan attendees for potential threats. The cameras were placed in Cardiff city centre rather than at the stadium where the concert took place, and compared faces against police-generated watchlists of known terrorists and paedophiles. Liberty, a human rights group, criticised the move, arguing that it reinforced discriminatory policing practices and disproportionately violated the privacy rights of thousands of people.

Facial recognition in retail: July 2023 saw the UK Home Office facing scrutiny over its alleged support for the use of LFR on retail premises. Critics argued that the indiscriminate use of LFR in commercial settings posed significant risks to privacy rights and civil liberties, since individuals could be subject to constant monitoring without their knowledge or consent. Big Brother Watch revealed that LFR cameras installed by a supermarket chain to tackle shoplifting had been disproportionately deployed in poorer areas, subjecting those communities to excessive privacy intrusions and unfair treatment.

LFR also has the potential to identify and remember customers and to offer them personalised promotions and recommendations based on their purchase history and demographic information. This could enhance the customer experience through streamlined enrolment, personalised rewards, and quicker, easier payment. Nevertheless, many LFR algorithms have been found to be less accurate at recognising certain skin tones, raising concerns about racial discrimination and profiling.

DP considerations when deploying LFR

LFR may have benefits across a wide range of sectors, provided that it is operated under appropriate safeguards and governance. Implementing appropriate safeguards can also help build public trust and confidence, which is essential to ensuring that the benefits of using LFR can be realised.

The ICO’s Opinion3 of June 2021 on the use of LFR in public places identified several key data protection issues that can arise when LFR is used to collect biometric data in public places. These include a lack of individual choice and control, a lack of transparency, the potential for bias and discrimination, and the processing of data relating to children and vulnerable adults. Failure to meet the relevant legal requirements could result in enforcement action, including fines, by the ICO. A recent example is the ICO’s £7.5 million fine against Clearview, a facial recognition software company. The ICO found that, by collecting and storing individuals’ images and matching them against probe images, Clearview had processed UK residents’ personal data without any lawful basis. Although the appeal is still underway, the case highlights the importance of complying with data protection laws when processing biometric data for identification purposes.

Given the sensitivity of the facial biometric data collected by LFR, organisations must adhere to stringent data protection requirements when deploying it. The key requirements include:

1. Lawful, fair, necessary and proportionate processing: organisations must identify a valid lawful basis and a specific, legitimate purpose for using LFR in public, and the use of LFR must be necessary and a targeted, proportionate and effective way of achieving that purpose. Because LFR uses biometric data for the purpose of uniquely identifying individuals, the data is special category data under UK data protection laws and requires additional protections;

2. Conditions for processing special category data and criminal offence data: in addition to identifying a valid lawful basis, organisations must also identify conditions for processing special category data and criminal offence data;

3. Consent and transparency: organisations must ensure that individuals are properly informed, before the processing takes place, about the collection and use of their biometric data, and explicit consent should be obtained where applicable. However, relying on consent as the lawful basis can be challenging where LFR cameras capture images of large numbers of people passing through public places such as shopping centres or transport interchanges;

4. Accuracy and bias mitigation: organisations should ensure that the LFR system deployed is sufficiently and statistically accurate for their purposes, and should address the risk of algorithmic bias by verifying the accuracy and fairness of the system across different demographic groups (a sketch of one way to test this appears after this list). Meaningful human oversight should also form part of the process; and

5. Accountability: organisations should undertake a data protection impact assessment (DPIA) to consider the risks and potential impacts of such processing on the rights and freedoms of the individuals being monitored. The ICO considered that the use of LFR in public places would typically meet at least one of the criteria under UK data protection laws requiring organisations to carry out a DPIA. In a DPIA, organisations must ensure that data protection and human rights safeguards are met. A cautionary example is South Wales Police’s 2019 watchlist, which included images of suspects of minor offences rather than only individuals wanted for violent offences, despite the force’s DPIA stating that each watchlist would be constructed in a way that was justified, proportionate and necessary given its nature.
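On point 4, the sketch below shows one simple way an organisation might check whether an LFR system’s false positive rate differs across demographic groups. It is a minimal illustration under stated assumptions: the record format and group labels are invented for the example, and a real assessment would use a far larger, representative labelled test set and an established evaluation methodology.

```python
# Minimal sketch: comparing false positive rates across demographic groups.
# Each record is (system_said_match, truly_on_watchlist, group) for one
# face captured during an evaluation run. All values here are invented.
records = [
    (True, False, "group_a"), (False, False, "group_a"),
    (False, False, "group_a"), (True, True, "group_a"),
    (True, False, "group_b"), (True, False, "group_b"),
    (False, False, "group_b"), (True, True, "group_b"),
]

def false_positive_rate(records, group):
    """Share of genuine non-matches in a group that the system flagged."""
    non_matches = [predicted for predicted, actual, g in records
                   if g == group and not actual]
    if not non_matches:
        return float("nan")  # no non-matches observed for this group
    return sum(non_matches) / len(non_matches)

# A large gap between groups signals algorithmic bias needing mitigation.
for group in sorted({g for _, _, g in records}):
    print(f"{group}: FPR = {false_positive_rate(records, group):.0%}")
```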

Conclusion

The use of LFR is developing in the UK, and it is being deployed in a wide range of circumstances. There will continue to be an emphasis on its potential benefits from a security perspective, but individuals’ rights must still be factored in when deploying LFR; the mass collection of data to identify specific individuals could otherwise pose a real risk to the general public. As George Orwell wrote in his novel 1984: “We know that no one ever seizes power with the intention of relinquishing it”. While the debate about controlling the use of LFR continues, it is important that policymakers put in place systems that protect the rights and freedoms of individuals when this technology is used.

Authors

Katie Hewson is a partner, Daniel Jones is an associate and Nelson Kiu is a senior paralegal at Stephenson Harwood LLP.

© PRIVACY LAWS & BUSINESS March 2024

1 https://committees.parliament.uk/committee/519/justice-and-home-affairs-committee/news/199624/lords-committee-questions-legality-of-live-facial-recognition-technology/
2 https://committees.parliament.uk/publications/43080/documents/214371/default/
3 https://ico.org.uk/media/2619985/ico-opinion-the-use-of-lfr-in-public-places-20210618.pdf