Data Protection update - April 2024
Welcome to the Stephenson Harwood Data Protection bulletin, covering the key developments in data protection law from April 2024.
This month, the EDPB published its Opinion on "consent or pay" models, the ICO issued new health and social care guidance as well as its Children's Code Strategy, and a discussion draft of the proposed American Privacy Rights Act of 2024 was published.
In other news, the CNIL has published recommendations on AI system development, whilst in the UK, the Department for Science, Innovation and Technology ("DSIT") has published guidance on the responsible use of AI in recruitment. The UK also signed a Memorandum of Understanding on AI safety with the US, and the California Assembly passed a bill requiring generative AI-created content to be watermarked.
Also this month, the UK has proposed a new Cyber Governance Code of Practice, the UK PSTIA Regulations came into force and the Irish DPC's appeal against the EDPB's rulings relating to Meta Platforms Ireland Limited proceeded to hearing stage at the EU General Court.
In this month's issue:
Data protection
- EDPB publishes Opinion on "consent or pay"
- New ICO health and social care guidance and children's strategy
- Discussion draft of US privacy legislation issued
AI
- New CNIL recommendations and DSIT guidance on AI
- UK and US sign a Memorandum of Understanding on AI safety
- California Assembly passes AI watermarking bill
Cybersecurity
- UK PSTIA Regulations come into force
Enforcement and civil litigation
- Irish DPC appeals against EDPB rulings
- DSA: VLOP designations and disputes
- ICO joins Global Cooperation Arrangement for Privacy Enforcement
- Round-up of enforcement actions
Data protection
EDPB publishes Opinion on "consent or pay"
On 17 April 2024, the European Data Protection Board ("EDPB") released its Opinion on the use of "consent or pay" models adopted, in particular, by large online platforms (the "Opinion"). A "consent or pay" model is one where users are presented with two options before they can use a platform: (i) consent to the processing of their personal data for behavioural advertising purposes; or (ii) pay a fee to avoid such processing. The Opinion focuses on whether consent to process personal data for behavioural advertising purposes in this context is valid. The Opinion was requested by the Dutch, Norwegian and Hamburg data protection authorities (each a "DPA") following Meta's launch of the model on Facebook and Instagram. The Opinion will be of particular interest to companies designated as very large online platforms under the Digital Services Act and gatekeepers under the Digital Markets Act, as well as businesses making use of their behavioural advertising services or relying on incentive-based consents.
In the Opinion, the EDPB establishes that, in most cases, large online platforms using this type of model will not be able to obtain valid consent if they confront users only with a choice between consenting to the processing of personal data for behavioural advertising purposes and paying a fee. This is because, in the EDPB's view, platforms would struggle to demonstrate that the consent is freely given. The EDPB suggests that large online platforms should offer an "equivalent alternative" that is free of charge and does not involve processing personal data for behavioural advertising. Conditionality, detriment, power imbalance and granularity should all be considered in any assessment of whether consent is freely given. Any fee charged must not coerce consent, and negative consequences such as exclusion from key services must be avoided.
The EDPB also stressed that obtaining users' consent does not exempt platforms from the need to adhere to GDPR principles including purpose limitation, data minimisation, fairness, necessity and proportionality.
The Opinion is non-binding, but DPAs can take it into account when deciding the outcome of their investigations into Meta.
In parallel, in March, the ICO launched a consultation on "consent or pay" models in the UK. Please see our article for more information on this consultation.
New ICO health and social care guidance and children's strategy
1. Health and social care sector:
On 15 April 2024, the ICO published guidance on transparency in health and social care. This follows the ICO's consultation on the topic, concluded in January 2024.
The guidance includes recommendations on:
- how an organisation can demonstrate that it is being open and honest;
- involving patients and the public when developing and evaluating transparency materials such as privacy notices;
- providing transparency and privacy information effectively and in a clear and accessible manner, including providing examples of potential harms that can arise from a lack of transparency; and
- how an organisation can assess whether it is being transparent.
2. Protecting children's privacy online:
On 3 April 2024, the ICO published its Children's Code Strategy (the "Strategy"), which sets out key areas of improvement for social media and video-sharing platforms to ensure that they keep children's personal information safe, as well as how it will continue to enforce the UK Children's Code, introduced in 2021.
In relation to social media and video-sharing platforms, the ICO's key areas of focus are:
- privacy and geolocation settings turned off by default;
- profiling children for targeted advertisements (this should be turned off by default);
- use of children's information in recommender systems (algorithmically-generated content feeds that risk exposing children to unsuitable content); and
- use of information of children under 13 years old (focusing on how services can obtain parental consent and use age assurance technologies).
To implement the Strategy, the ICO plans to publish a call for evidence in summer 2024, to gain input from a range of stakeholders.
Discussion draft of US privacy legislation issued
On 7 April 2024, the chairs of the US House and Senate commerce committees released a discussion draft of the proposed American Privacy Rights Act of 2024 ("APRA"). This federal US privacy legislation would establish, amongst other things, enforceable national data protection rights.
Under APRA, Americans would have the right to control their own personal data and to opt out of targeted advertising and of the use of their personal data in algorithms that make decisions about housing, employment, healthcare, credit opportunities, education, insurance, or access to public accommodation.
The US Federal Trade Commission and state Attorneys General would enforce the legislation, and individuals would also have a private right of action. Individuals bringing claims under the legislation would be able to obtain damages, injunctive relief, declaratory relief and legal costs.
This is not the first time a federal bill on privacy rights has been proposed: the last version, the American Data Privacy and Protection Act, failed to advance through the House or Senate in the last Congress.
We understand from our US peers that APRA is unlikely to be passed in its current form.
AI
New CNIL recommendations and DSIT guidance on AI
Following a public consultation, on 8 April 2024, the French data protection authority, the CNIL, published recommendations on how the GDPR applies to the development of AI systems.
CNIL's recommendations include guidance on how to determine:
- which data protection laws would apply to the AI system;
- the data protection law roles of those involved in the AI system (including joint controllership);
- the legal basis for processing and purpose limitation in the context of AI systems; and
- when and how a data protection impact assessment would be required.
These recommendations should make it clearer to companies involved in the development of AI systems (particularly those working with generative AI) how they can ensure that their activities remain compliant with GDPR. The CNIL has stated that in the next few months, it will launch another public consultation so it can provide further guidance to supplement these recommendations.
The interaction of AI and data protection legislation is a topic of concern not only in France. On 25 March 2024, DSIT published guidance on using AI responsibly in recruitment processes.
DSIT's guidance includes information on:
- the need for the use of AI to adhere to the five regulatory principles identified in the Government's white paper;
- the risks associated with certain AI tools used in recruitment processes;
- assurance mechanisms that should be used across each stage of the procurement and deployment process; and
- considerations that organisations should take into account when procuring and deploying AI systems.
In recent months, the ICO has also launched multiple calls for evidence on the topic of generative AI. Please see our blog post for further details.
UK and US sign a Memorandum of Understanding on AI safety
In November 2023, at the AI Safety Summit (the "Summit"), the US and the UK governments announced the creation of their respective AI Safety Institutes. Please see our article on the Summit here.
On 1 April 2024, the UK and the US signed a Memorandum of Understanding ("MOU") on AI safety. The MOU provides more detail on how the two countries will work together to follow through on commitments they made at the Summit. The goal is for the countries to collaborate to develop a shared approach to testing advanced AI models and to share information about the risks and capabilities associated with AI systems.
California Assembly passes AI watermarking bill
The California Assembly's Privacy and Consumer Protection Committee has passed a bill requiring generative AI-created content to be watermarked (the "Bill"). If the Bill is enacted, any breaches could lead to fines of up to $1 million or 5% of annual global revenue, whichever is greater.
Many industry groups oppose the Bill, arguing that watermarking technology is too unreliable to be deployed in this way. The next stage is for the Bill to be passed by the full Assembly. Many other states are considering similar measures.
Cybersecurity
UK PSTIA Regulations come into force on 29 April 2024
On 29 April 2024, the Product Security and Telecommunications Infrastructure (Security Requirements for Relevant Connectable Products) Regulations 2023 (the "Security Regulations") came into effect. The Security Regulations are part of the new security regime imposed by the UK Product Security and Telecommunications Infrastructure Act 2022 ("PSTIA").
The Security Regulations apply to relevant connectable products, namely internet-connectable products or network-connectable products, with a number of exceptions.
The Security Regulations place obligations on the manufacturers of relevant connectable products on sale in the UK to take steps including:
- implementing robust passwords;
- providing information on how to report security issues;
- providing information on minimum security updates; and
- publishing a statement of compliance to accompany the product.
Please see our blog post for further details.
Enforcement and civil litigation
Irish DPC appeals against EDPB rulings
The dispute between the Irish Data Protection Commission ("DPC") and the EDPB concerning the EDPB's binding decisions on Meta Platforms Ireland Limited (specifically, 3/2022, 4/2022 and 5/2022) (the "Binding Decisions") has proceeded to the hearing stage at the EU General Court. Please see our article for further context.
In the Binding Decisions, the EDPB directed the DPC to conduct fresh investigations on whether Facebook and Instagram processed sensitive data for targeting advertisements, and whether WhatsApp processed data for behavioural advertising purposes and shared data with third parties.
The DPC lodged applications to the EU General Court for the annulment of part of the Binding Decisions, arguing that the EDPB did not have the legal power to order it to open new investigations. The DPC contended that such an order would undermine the lead supervisory authority's authority and efficacy, and create real risks to procedural fairness, which could lead to successful appeals against the DPC's decisions.
The EDPB argued that it had the power to include anything in its final order if it related to an objection by a supervisory authority to the findings of another supervisory authority. In this case, the EDPB received objections from other supervisory authorities that the DPC should have widened the scope of its investigations to include special category data.
The EDPB also argued that the DPC’s interpretation would give sole power to the lead authority (in this case Ireland where most large US technology companies’ headquarters are based) to decide the scope of any investigations, which is not the intended approach to GDPR enforcement.
The eventual decision will be of interest to those conducting cross-border processing in the EU, as it will clarify the scope of the EDPB's authority.
DSA: VLOP designations and disputes
In April 2023, the European Commission (the "Commission") published the names of the first batch of companies designated as very large online platforms ("VLOPs") for the purposes of the Digital Services Act ("DSA"). VLOPs are online platforms with an average of at least 45 million monthly active users in the EU, and are required to meet a number of additional obligations, including:
- carrying out risk assessments on any systemic risks associated with the VLOPs service (Article 34);
- introducing measures to mitigate the risks identified in the risk assessments (Article 35);
- providing easy to read and multilingual versions of their terms and conditions (Article 14(5)); and
- making a searchable repository about adverts displayed on their platforms publicly available (the "Information") (Article 39).
Amazon lodged an appeal to the Court of Justice of the European Union (the "CJEU") against its designation as a VLOP and obtained an interim order from the General Court suspending disclosure of the Information (the "Interim Order"), pending the result of that appeal.
However, in a recent decision, the CJEU set aside the Interim Order. The CJEU recognised that some of the Information is confidential and may amount to business secrets, but ultimately decided that the Commission's interest in the full implementation of the DSA outweighed Amazon's business interests.
The Commission has also designated German online retailer Zalando as a VLOP for the purposes of the DSA. Following designation, the Commission charges VLOP providers annual supervisory fees (the "Fees") to cover the costs of performing its supervisory tasks under the DSA. The annual Fee is capped at 0.05% of the provider's annual global income.
Zalando has recently filed a fresh legal dispute at the General Court, concerning the methodology through which the Fee is calculated. TikTok and Meta have also taken actions relating to the Fee calculation.
ICO joins Global Cooperation Arrangement for Privacy Enforcement
On 4 April 2024, the ICO announced it has signed an international multilateral agreement with the Global Cooperation Arrangement for Privacy Enforcement ("Global CAPE") to cooperate in cross-border data protection and privacy enforcement. The ICO can share information and collaborate on investigations with Global CAPE members.
Global CAPE members include the United States, Australia, Canada, Mexico, Japan, the Republic of Korea, the Philippines, Singapore, and Chinese Taipei.
Round-up of enforcement actions
Company | Authority | Fine | Comment |
I-de Redes Electricas Inteligentes, S.A.U. | Spanish DPA | €3 million + €3.5 million | The company was fined €3.5 million for a data breach incident affecting 1.35 million customers. The company was also fined €3 million for not implementing sufficient technical and organisational measures. |
CaixaBank | Spanish DPA | €1,200,000 | The bank was fined for insufficient consent. The data subject had no option to reject a clause in the bank's documents giving consent to the bank to access their data from the General Treasury of Social Security. Refusal could have led to account closure. |
HUBSIDE.STORE | French DPA | €525,000 | HUBSIDE.STORE was fined for conducting phone and SMS marketing campaigns without a legal basis for processing data and not complying with the obligation to inform individuals. |
Santander Bank Polska S.A. | Polish DPA | PLN 1,440,000 (€330,000) | The bank was fined for failing to report a data breach. The DPA learned of the breach from media reports that a stolen parcel containing personal data had been abandoned. |
Laziocrea | Italian DPA | €271,000 | Laziocrea suffered a ransomware attack, and a subsequent investigation revealed several GDPR violations, including failure to implement sufficient security measures and notify the DPA on time. |
Ministry of Immigration and Asylum | Greek DPA | €175,000 | The Ministry was fined for failing to cooperate with the Greek DPA and for an incomplete DPIA concerning electronic and physical systems, including entry-exit systems using fingerprint readers. |