Data Protection update - February 2024

Welcome to the Stephenson Harwood Data Protection bulletin, covering the key developments in data protection law from February 2024.

This month, the UK Government published its response to the consultation on its AI White Paper, issued in March 2023. This covers the government's position on future AI legislation and regulation, investments in skills and technology, AI best practices in the public sector and an AI Code of Practice.

In other AI news, the European Commission has established an AI Office, marking a significant milestone in the EU’s journey towards a coordinated AI regulatory framework.

Elsewhere this month, the UK Information Commissioner's Office (the "ICO") published its guidance on the use of biometric data and, on the same day, ordered Serco Leisure to stop deploying biometric monitoring technologies to track the attendance of its leisure centre employees.

In this month's issue:

Data protection

Digital commerce

AI

Cyber security

Enforcement and civil litigation

Data protection

UK cookie enforcement steps up

In November 2023, the ICO wrote to 53 organisations running the UK's top websites (based on active time spent by UK users), pressing them to update their cookie policies and banners to ensure compliance with data protection law. The ICO assessed the cookie banners on these websites to see whether non-essential advertising cookies: (i) were placed before the user could provide consent; (ii) could be rejected as easily as they could be accepted; and (iii) were placed even if the user did not consent. These organisations were given 30 days to take action to ensure that their websites complied with the law. The ICO noted that “many of the biggest websites have got this right. We’re giving companies who haven’t managed that yet a clear choice: make the changes now, or face the consequences".

In a recent update, the ICO confirmed that of the 53 organisations contacted, 38 have now changed their cookie banners to be compliant and four have committed to be compliant within the next month.

Looking forward, the ICO plans to accelerate its cookie enforcement efforts by developing an AI tool to identify websites with non-compliant cookie banners. In the meantime, the ICO has stated that they "will not stop with the top 100 websites. We are already preparing to write to the next 100 – and the 100 after that". The ICO hopes this will encourage UK organisations to take action on their websites to ensure compliance. The ICO has warned that "where organisations continue to ignore the law, they can expect to face the consequences".
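
For illustration only, the short sketch below (in Python, using the Playwright browser automation library) shows one way a check along the lines of the ICO's first and third criteria might be approximated: loading a page in a headless browser without interacting with the cookie banner and listing any cookies set before consent has been given. This is not the ICO's tool; the URL and the allow-list of "essential" cookie names are hypothetical, and a real assessment would also need to classify cookies and test the reject journey.

    # Minimal sketch (illustrative only): load a page without touching the cookie
    # banner and report any cookies set before the user has consented.
    # Assumes Playwright is installed (pip install playwright; playwright install)
    # and a hypothetical allow-list of cookie names treated as strictly necessary.
    from playwright.sync_api import sync_playwright

    ESSENTIAL_COOKIES = {"session_id", "csrf_token"}  # hypothetical allow-list

    def cookies_set_before_consent(url: str) -> list[str]:
        with sync_playwright() as p:
            browser = p.chromium.launch(headless=True)
            context = browser.new_context()
            page = context.new_page()
            page.goto(url, wait_until="networkidle")  # no interaction with the banner
            names = {cookie["name"] for cookie in context.cookies()}
            browser.close()
        # Anything outside the allow-list was placed without consent.
        return sorted(names - ESSENTIAL_COOKIES)

    if __name__ == "__main__":
        print(cookies_set_before_consent("https://example.com"))  # placeholder URL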

EDPB launches website auditing tool

At the end of January 2024, the European Data Protection Board (the "EDPB") launched a new tool enabling users to analyse whether websites are compliant with data protection law. The tool is aimed at legal and technical auditors at data protection authorities; however, it can also be used by controllers and processors more generally to assess the compliance of their own websites.

The EDPB claims that the tool is a solution which is "easy to use in order to facilitate enforcement by national DPAs and compliance checks by controllers". Users will be able to prepare, carry out and evaluate website audits within the tool itself, and then generate a report showing necessary updates.

Those that wish to access and download the tool can do so here.

ICO issues guidance on biometric recognition systems

This month, the ICO published guidance on how data protection law applies in relation to the use of biometric recognition systems by organisations.

The guidance considers:

  • what biometric data is;
  • when it is considered special category data;
  • its use in biometric recognition systems; and
  • the data protection requirements organisations must comply with.

The guidance uses a "must", "should" and "could" classification system to identify and distinguish legal requirements from good practice. Topics covered include: processing biometric data lawfully and fairly; how the accuracy principle applies to the data; how to keep processing transparent; how to consider rights requests for biometric data; and how to keep data secure.

The guidance does not cover the use of biometric classification or categorisation systems, as the ICO intends to publish separate guidance on these topics by the end of 2024.

On the same day as the ICO released this guidance, the ICO also ordered Serco Leisure to stop deploying and using monitoring technologies such as facial recognition and fingerprint scanning to track the work attendance of its leisure centre employees (please see our summary below).

ICO approves legal services certification scheme

The ICO has approved a new certification scheme, which legal services providers can choose to make use of when processing client personal data.

This is the fifth set of voluntary, sector-specific UK GDPR certification criteria that the ICO has approved under Article 42 of the UK GDPR. It is intended to help legal services providers (both controllers and processors) demonstrate compliance with data protection requirements and, in doing so, to inspire trust and confidence in the legal industry. The scheme will apply to the whole legal industry and will relate to the processing of personal data held in client files.

Digital commerce

Smaller businesses face compliance with the Digital Services Act

The first tranche of obligations under the EU Digital Services Act (the "DSA") came into force in November 2022 (see our commentary on this here), and the second phase has now come into effect from 17 February 2024. The DSA aims to "create a safer digital space in which the fundamental rights of all users of digital services are protected". Digital services cover a large category of online services, from simple websites to internet infrastructure services and online platforms.

The first phase of the DSA obligations affected so-called "Very Large Online Platforms" and "Very Large Online Search Engines" (those with more than 45 million users in the EU). The second phase of DSA obligations now in force applies to all online platforms with users in the EU, hosting services, and online intermediaries (such as internet service providers). There are exemptions for businesses with fewer than 50 staff and an annual turnover below €10 million.

Under the second tranche of obligations - summarised in the European Commission's press release here - all online platforms with users in the EU (aside from any exempt businesses) will be required to implement measures to:

  • Counter illegal content, goods, and services: online platforms must provide users with means to flag illegal content, including goods and services. In addition, online platforms will have to cooperate with 'trusted flaggers', specialised entities whose notices will have to be given priority by platforms.
  • Protect minors: including a complete ban on targeting minors with ads based on profiling or on their personal data.
  • Empower users with information about advertisements they see, such as why the ads are being shown to them and who paid for the advertisement.
  • Ban advertisements that target users based on sensitive data, such as political or religious beliefs, sexual preferences, etc.
  • Provide statements of reasons to a user affected by any content moderation decision, e.g., content removal, account suspension, etc. and upload the statement of reasons to the DSA Transparency database.
  • Provide users with access to a complaint mechanism to challenge content moderation decisions.
  • Publish a report of their content moderation procedures at least once per year.
  • Provide users with clear terms and conditions, including the main parameters on which their content recommender systems are based.
  • Designate a point of contact for authorities, as well as users.

Hosting services and intermediary services will not be obligated to comply with all of the above. For a summary table of the full obligations applicable to very large platforms, online platforms, hosting services and online intermediaries please see here.

Violations could result in fines of up to 6% of annual global turnover.
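By way of a purely hypothetical illustration, a business with an annual global turnover of €500 million could, on this basis, face a fine of up to €30 million (6% of €500 million).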

Unlike the larger platforms, these smaller entities will have their obligations enforced by member state regulators rather than by the European Commission. Businesses will therefore be required to declare their EU headquarters and designate a representative in that territory. To oversee the consistent implementation of the DSA across the EU, the European Board for Digital Services has been established.

AI

UK DSIT publishes response to AI White Paper consultation

On 6 February 2024, the Department for Science, Innovation & Technology ("DSIT") published its response to the consultation on its AI White Paper, issued in March 2023. The response includes various announcements and initiatives prompted by the feedback DSIT received during the consultation.

A summary of the response is as follows:

Future Legislation Considerations

In response to concerns about whether new AI-specific legislation is required, the government acknowledged the need to address the challenges posed by General Purpose AI ("GPAI") systems, including potential future legislation to ensure their safe use and developer accountability. More immediate legislative plans include expanding lawful bases for automated decision-making under the Data Protection and Digital Information (No 2) Bill. The plan is still for existing sectoral regulators to lead on AI enforcement activity in the UK.

AI Regulation Roadmap

DSIT outlines its roadmap for 2024, which includes collaborating with industry stakeholders and regulators to develop AI regulatory policy across a range of sectors, addressing AI risks, and providing guidance and support for AI adoption across industries. International collaboration on AI governance will also be prioritised. This is in response to regulators suggesting that collaboration between regulators, safety engineers, and AI experts is key to creating robust verification measures that prevent, reduce, and mitigate risks.

Establishment of a Central Function for Regulatory Coordination

DSIT has established a central function within government to oversee regulatory activity, promote information sharing, and monitor risks in response to feedback indicating strong support for such a body to co-ordinate regulatory activity and prevent gaps in regulatory guidance. Regulators, including Ofcom, the ICO, and the FCA, are mandated to outline their approach to AI regulation by 30 April 2024.

Investment in Skills and Technology

Regulators reported varying levels of in-house AI knowledge and capability, supporting central measures to enhance technical expertise. In response, DSIT has announced significant investments aimed at enhancing technical expertise and supporting AI innovation. This includes allocating £100 million to train regulators in AI-related skills through a collaboration with UK Research and Innovation.

Expanding AI Best Practices in the Public Sector

The government intends to make the Algorithmic Transparency Recording Standard (ATRS) mandatory for all government departments, with plans to gradually expand its usage across the broader public sector. This move aims to ensure transparency and accountability in the use of AI algorithms within government operations.

No Copyright and AI Code of Practice

The Intellectual Property Office (IPO) working group on AI and copyright failed to reach a consensus on a voluntary code of practice. Respondents emphasised the need for greater AI transparency to address the issue of intellectual property infringement, and the government will explore alternative approaches to foster collaboration between AI developers and copyright holders, emphasising trust and transparency in the process.

 

The government's response demonstrates a desire to balance innovation with regulatory guardrails, to foster responsible AI development and adoption in the UK.

European Commission’s AI Office established

The European Commission has established an AI Office, with effect from 21 February 2024. The establishment of the AI Office marks a significant milestone in the EU’s journey towards a coordinated AI regulatory framework which balances innovation with accountability.

Here is what you should know:

1. Mandate and responsibilities

The AI Office will serve as the central coordination body for AI policy at the EU level, implementing and enforcing the upcoming EU AI Act. It will collaborate with various stakeholders, including European Commission departments, EU bodies, Member States, and the broader stakeholder community, to promote the EU approach to AI governance and foster innovation.

2. Enforcing the AI Act

As the enforcer of the AI Act, the AI Office will supervise and investigate providers of GPAI models, ensuring compliance with the stringent regulations outlined in the legislation.

3. Governance structures

The AI Office will work in tandem with the European Artificial Intelligence Board, a Scientific Panel of Independent Experts, and an Advisory Forum to guide the implementation of the AI Act, integrate scientific insights, and facilitate stakeholder engagement.

4. Codes of practice

Central to the EU’s governance framework are codes of practice aimed at enhancing compliance with the AI Act. The AI Office will play a supporting role in the preparation of secondary legislation, guidance, standards and codes of practice to facilitate the uniform application of the AI Act.

5. Supervision of GPAI models

Given the risks associated with GPAI models, the AI Office will play a crucial role in ensuring their compliance, monitoring, and governance within the EU’s regulatory framework. It will collaborate with stakeholders to develop codes of practice tailored to GPAI systems and engage in joint investigations to promote compliance.

6. Regulatory collaboration

Collaboration will be key to the AI Office’s operations, with a focus on engaging stakeholders from diverse backgrounds, including AI developers, regulatory bodies, civil society, and academia.

7. Market monitoring

The AI Office will monitor the evolution of AI markets and technologies, developing tools for evaluating AI models and identifying potential infringements. It will work closely with national market surveillance authorities to ensure compliance and address unforeseen risks, safeguarding the public interest.

ASEAN releases its long-awaited AI governance and ethics guide

The Association of Southeast Asian Nations ("ASEAN") released its highly anticipated ASEAN Guide on AI Governance and Ethics on 2 February 2024, offering recommendations for how businesses and governments should address AI (the "ASEAN Guide"). Developed collaboratively by ASEAN member nations, the guide is designed to foster alignment in AI approaches across the region.

The ASEAN Guide outlines seven guiding principles for AI development: transparency and explainability; fairness and equity; security and safety; “human-centricity”; privacy and data governance; accountability and integrity; and robustness and reliability. The ASEAN Guide also highlights the importance of national coordination, proposing the establishment of an ASEAN working group on AI to facilitate intergovernmental cooperation.

While the ASEAN Guide largely mirrors international AI governance principles, it deviates from other international stances by not prohibiting specific AI uses. Instead, the ASEAN Guide emphasises heightened human involvement in high-risk scenarios.

Cyber security

US sanctions affiliates of LockBit ransomware group responsible for disrupting the US Treasury market

Russia-based ransomware group LockBit has been implicated in multiple ransomware attacks targeting critical infrastructure such as hospitals, schools, and financial institutions. Notably, the group was responsible for a ransomware attack against the Industrial and Commercial Bank of China's US broker-dealer in November 2023; please see our summary of this incident here. In early 2023, Britain’s Royal Mail also faced severe disruption after an attack by the group.

In a recent update, the US has now imposed sanctions on affiliates of LockBit following an international law enforcement operation, "Operation Cronos", led by Britain's National Crime Agency and the FBI. The operation involved an international coalition of 10 countries.

The agencies have arrested, indicted or sanctioned some of the perpetrators and they have also seized control of websites used by LockBit. As a result of these sanctions, all property and interests owned by the designated individuals within US jurisdiction are blocked, and transactions involving them are prohibited without authorisation from the Office of Foreign Assets Control (OFAC). The ultimate goal of these sanctions is to induce positive behavioural changes and enhance cybersecurity measures.

Enforcement and civil litigation

Serco Leisure ordered by ICO to stop biometric data processing in the workplace

On the same day as the ICO published the finalised version of its guidance on the use of biometric data (summary above), the ICO also ordered Serco Leisure to stop deploying and using monitoring technologies such as facial recognition and fingerprint scanning to track the work attendance of its leisure centre employees. This is the first reported intervention by the ICO in employee monitoring and processing of employee biometric data.

This decision came as a result of the ICO's investigation, which found that Serco Leisure had been unlawfully processing the biometric data of more than 2,000 employees to check on their attendance and to determine their wages. Serco Leisure failed to demonstrate why its actions were necessary or proportionate and was held to be "prioritising business interests over its employees’ privacy”. It was argued that less invasive methods, for example ID cards or fobs, could have achieved the same result.

By collecting biometric data without giving leisure centre staff an alternative, consent to such processing was effectively presented as a requirement in order to remain employed. The power imbalance between employer and employee exacerbated the enforced nature of the processing, as staff would not have felt comfortable in refusing Serco Leisure access to their data.

The ICO has ordered Serco Leisure to stop all such biometric processing and destroy all existing biometric data it holds within three months. Serco Leisure said it will take steps to fully comply with the enforcement notice.

EU opens formal investigation into TikTok breaches

On 19 February 2024, the EU announced it will investigate TikTok's alleged breaches of the DSA (please see an update on this legislation above), aimed at protecting children and ensuring transparent advertising.

In particular, the investigation will consider addictive design and screen limits, algorithmic recommendations, age verification and default privacy settings. The focus will be on the design of TikTok's system and its algorithms, which may stimulate behavioural addictions. The European Commission will assess whether TikTok's security measures are appropriate and proportionate and whether TikTok provides a reliable database of advertisements.

If found in breach, TikTok could face fines of up to 6% of its global turnover. TikTok said it will continue to cooperate to ensure young people on its platform are kept safe. No deadline has been set for the investigation.

ECHR rules against Russian data requirements for law enforcement purposes

On 13 February 2024, the European Court of Human Rights ("ECHR") ruled that Russian legislation requiring communication services to store all user data and create encryption backdoors for law enforcement purposes was incompatible with democratic principles.

This decision arose from a case involving the messaging service Telegram, which was ordered by Russian authorities in 2017 to provide decryption assistance in respect of users accused of terrorism-related activities. Notably, Russia's departure from the European Convention on Human Rights did not deprive the ECHR of jurisdiction over this case, as the events predated Russia's withdrawal in September 2022.

The ECHR recognised the privacy benefits of encryption, emphasising its role in protecting users from online threats. However, it concluded that decryption measures, as mandated by Russian law, would undermine privacy rights and compromise the security of electronic communications for all users. The ECHR further ruled that the storage and retention of user communications data violated the right to privacy, irrespective of whether the retained data was then accessed by the authorities.

This landmark ruling is expected to impact current and future policies in Europe seeking to regulate encryption mechanisms. Civil society organisations and industry experts anticipate changes in EU legislation, emphasising the importance of steering clear of proposals that compromise fundamental rights in the digital age.

Round-up of enforcement actions

  • ICO (authority: ICO; fine: none): The ICO issued an enforcement notice against itself following a complaint made by an individual for failing to respond to their FOI request within 20 working days, thereby breaching section 10(1) of FOIA.

  • Uber (authority: Netherlands DPA; fine: €10 million): Uber was fined for failing to make the online form used to request access to personal data sufficiently accessible, providing that data in an incorrect format, and providing insufficient information in its privacy statement regarding data retention, transfers and portability rights, thereby breaching the EU GDPR.

  • Morele.net (authority: Poland DPA; fine: €879,000): The e-commerce site was fined for failing to implement adequate security measures, resulting in the unauthorised access to and leak of 2.2 million people's data.

  • Dvora Beauty Center (authority: Italy DPA; fine: €8,000): Following a complaint regarding the dissemination of a video of a patient in the absence of a valid legal basis, the company was fined for failing to obtain the patient's consent.

  • Meta (authority: German court; fine: none): Meta's use of language such as "subscribe" and "continue to payment" on the pay-or-consent buttons on its Facebook and Instagram platforms was found to violate German consumer law, as it lacked clear wording indicating a commitment to pay.