Data Protection update – January 2025
Welcome to our traditional bumper double edition of the Stephenson Harwood Data Protection update, covering the key developments in data protection and cyber security law from December and January.
In a busy period for data protection developments, the ICO has given a caveated green light for organisations to use "consent or pay" models for targeted advertising, is consulting on draft guidance on storage and access technologies, and has also said it will press ahead with its "public sector approach" following the conclusion of a two-year trial period; the EDPB has issued an opinion on the processing of personal data in the context of AI development, has published position papers on data protection in a competition law context and on pseudonymisation (the latter of these being subject to consultation), and has published the results of its year-long coordinated enforcement effort on the GDPR right of access; the first amendments to the DUA Bill, currently making its way through Parliament, have been approved; Ofcom has published its finalised age-assurance guidance; and DSIT has published the final details of increases to data protection fees payable by organisations.
In cyber security news, the UK Home Office and National Cyber Security Centre ("NCSC") are jointly consulting on a range of proposals aimed at reducing the level of ransomware payments made by UK organisations, both public and private sector; joint guidance has been published by the UK NCSC and various other international cyber security authorities on including "secure by design" principles in "operational technology" procurement processes for critical infrastructure organisations; and the US Treasury has said that it was hacked by China in what it has described as a "major incident".
In enforcement and civil litigation news, the European Commission has been found to have breached the GDPR in transferring personal data of its website users to the US, as well as separately when serving targeted advertising to social media users; the Court of Appeal has dismissed an attempt to bring an opt-out class action against Google for misuse of private information; the High Court has ruled that a gambling addict did not, and could not, properly and freely consent to the collection of his personal data via cookies placed by a gambling platform; the CJEU has published two rulings clarifying certain aspects of the GDPR, including the meaning of "excessive" subject access requests; and the High Court has ruled against HMRC in a case brought by Mike Ashley regarding a subject access request to which, the Court found, HMRC had responded inadequately because it adopted an unduly narrow conception of what constituted personal data.
Event invite: Ransomware in action: live attack demo and legal troubleshooting
26 February – 5:00 PM to 7:30 PM – London
We are delighted to invite you to an event jointly hosted by Stephenson Harwood and CYPFER.
The event will involve a live simulation of a ransomware attack, with a real time blow-by-blow discussion of the legal issues that arise at each step of the process. Observe as we demonstrate how a hacker infiltrates an organisation's network and deploys malicious software to encrypt critical data.
This session will shed light on the inner workings of ransomware attacks, showcasing the tactics, techniques, and procedures used by cybercriminals. Gain valuable insights into the immediate and long-term impacts of such attacks on businesses, including operational disruption, financial loss, reputational damage and litigation risk. Enhance your understanding of the critical steps to take during an attack and explore the legal ramifications that organisations can face in the aftermath.
We hope you can join us for this exciting event, led by Joanne Elieli (Partner and Cyber Lead) and Katie Hewson (Partner and Head of Data Protection) from Stephenson Harwood, and Kadir Levent (Managing Partner) and Craig McKay-Kilfedder (Associate Manager, Consulting) from CYPFER.
Register here.
Data protection
- ICO gives qualified go-ahead for "consent or pay" models
- European Data Protection Board issues opinion on using personal data in AI development
- ICO criticises Google for change in policy on 'fingerprinting' and consults on updated guidance for storage and access technologies
- Peers approve a trio of amendments to the Data (Use and Access) Bill
- EDPB consults on pseudonymisation guidance and issues opinion on the interplay between data protection and competition law
- EDPB reports on the outcomes of its right-of-access Coordinated Enforcement Framework
- Ofcom publishes age-assurance implementation guidance
- ICO presses ahead with "public sector approach"
- UK DSIT announces revised data protection fees
Cyber security
- Home Office, NCSC consult on proposals to reduce the threat of ransomware
- International cybersecurity authorities publish joint guidance for organisations on secure operational technology products
- 'Major incident' sees US Treasury officials accuse China of cyber attack
Enforcement and civil litigation
- European Commission ruled to have breached GDPR in US personal data transfers, ordered to pay damages
- Commission reprimanded for breaching data protection obligations in targeted advertising campaign
- Court of Appeal upholds High Court ruling dismissing class action against Google for misuse of private information
- High Court rules that problem gambler did not, and could not, freely consent to processing of personal data by gambling platform
- High Court rules against HMRC in case with wider importance for what is considered "personal data"
- CJEU clarifies that intent is key when assessing whether a subject access request is "excessive"
- General Court finds EDPB was entitled to order Irish DPC to conduct additional investigations into Meta
- Meta fined €251 million by Irish DPC over Facebook data breach in 2018
- OpenAI fined €15 million by Garante over data protection failings, says it will appeal
- CJEU rules that French national rail company breached GDPR by collecting gender data for online ticket purchases
- Noyb brings complaints against TikTok and five other Chinese tech firms, alleging unlawful transfers of Europeans' personal data to China
- Round-up of enforcement actions
Key US updates
- US Department of Justice issues final rules restricting access to American personal data by countries posing national security concerns
- Apple settles US litigation over third-party sharing of "Siri" voice recordings for $95 million
- US State privacy laws update
Data protection
ICO gives qualified go-ahead for "consent or pay" models
The ICO has said, in guidance published in January, that "consent or pay" models for using personal data to target online advertising are not inherently non-compliant with UK data protection law, and has set out the factors that organisations will need to consider when assessing whether a model is compliant.
"Consent or pay" is an increasingly popular term for allowing or preventing access to online products and services. A website or app will present visitors with the option to either: a) consent to the use of their personal data to target advertising to them, in which case access to the relevant website, product or service is free of charge; or b) withhold consent for the use of their personal data for these purposes, in which case the user is then required to pay a fee to access the website, product or service.
In its guidance, published following a call for evidence issued in March 2024, the ICO has concluded that these "consent or pay" models "can be compliant with data protection law if you can demonstrate that people can freely give their consent and the models meet the other requirements set out in the law". Factors that will be relevant in determining whether individuals can freely give, or withhold, their consent include:
- Power imbalance between the organisation and the users of its product or service;
- Whether the fee has been set at an appropriate level;
- Whether the core service provided by the organisation is, ultimately, "broadly equivalent" irrespective of whether a user has opted for the "consent" option or the "pay" option; and
- Whether the choices are presented to users equally, and in a clear and comprehensible manner – so-called "privacy by design".
The ICO draws a distinction between "consent or pay" models and so-called "take it or leave it" models. A "take it or leave it" model is one in which a user must consent to the collection and use of their personal data before accessing a website or other online product or service, with access refused outright if consent is withheld. Such models are, in most cases, not compliant with data protection law.
The guidance stipulates that organisations carrying out an assessment of whether individuals can freely give or withhold consent must document the assessment and be prepared to justify how their model is compliant.
European Data Protection Board issues opinion on using personal data in AI development
The European Data Protection Board ("EDPB") has issued a highly anticipated opinion on the data protection implications of using personal data to train and deploy AI models.
Overall, the EDPB's opinion concludes that controllers are not precluded from relying on "legitimate interests" as a lawful basis for processing personal data for these purposes. A controller will need to carry out the usual three-part legitimate interests test on a case-by-case basis. Where the relevant legitimate interest of the controller is "lawful", "clearly and precisely articulated", and "real and present, not speculative", it is likely that this lawful basis can validly be relied on.
The EDPB also takes the view that even where personal data has been unlawfully processed in the context of developing an AI model, this does not necessarily mean that the deployment of that same model is also unlawful. The assessment will depend upon the extent to which the processing activities in development and deployment can be considered as being carried out for separate purposes.
We published a fuller review of the key points from the EDPB's opinion in our January 2025 edition of Neural Network, our newsletter reporting on developments in the world of AI, which you can read here. Meanwhile, the EDPB's opinion is available to read in full here.
ICO criticises Google for change in policy on 'fingerprinting' and consults on updated guidance for storage and access technologies
The ICO has released a statement in response to Google's recent change in policy to remove prohibitions on organisations employing "fingerprinting" techniques when using Google's advertising products.
Fingerprinting is where information about a device's software or hardware is collected, with the combination of that data making it possible to identify a particular device or user. In its statement, the ICO made clear that it does not view fingerprinting as a fair method of tracking users online, as it is "likely to reduce people's choice and control over how their information is collected". The regulator has branded Google's policy change as "irresponsible" and noted that in 2019 Google itself stated that fingerprinting "subverts user choice and is wrong".
The ICO has said that it will continue to engage with Google regarding this policy change and has more generally reminded businesses of their obligations regarding fingerprinting and privacy – noting, for example, that "data protection law, including the Privacy and Electronic Communications Regulations … applies. Businesses must give users fair choices over whether to be tracked before using fingerprinting technology, including obtaining consent from their users where necessary".
On the same day that this statement was published, the ICO published draft guidance on how data protection law applies to storage and access technologies such as fingerprinting, and opened a consultation, which will run until 14 March 2025, allowing organisations the chance to provide feedback on the draft guidance. The draft guidance makes it clear that a wide range of tracking technologies come within the scope of the requirement to obtain prior consent, including various "cookieless" alternatives.
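For illustration, the sketch below shows in simplified, hypothetical form how a handful of individually unremarkable browser properties can be combined into a near-unique device identifier without storing anything on the device. It is a minimal example of the general technique only, not a representation of how Google or any other advertising provider actually implements fingerprinting.

```typescript
// Minimal, hypothetical sketch of browser fingerprinting. Real-world
// implementations combine many more signals (installed fonts, canvas
// rendering, audio stack, etc.); this only illustrates the idea.
async function deviceFingerprint(): Promise<string> {
  // Each signal is unremarkable on its own; in combination they can
  // identify a device with surprising precision, and no cookie is set.
  const signals = [
    navigator.userAgent,                                      // browser + OS
    navigator.language,                                       // locale
    `${screen.width}x${screen.height}x${screen.colorDepth}`,  // display
    Intl.DateTimeFormat().resolvedOptions().timeZone,         // time zone
    String(navigator.hardwareConcurrency),                    // CPU cores
  ].join("|");

  // Hash the combined signals into a stable, opaque identifier.
  const bytes = new TextEncoder().encode(signals);
  const digest = await crypto.subtle.digest("SHA-256", bytes);
  return Array.from(new Uint8Array(digest))
    .map((b) => b.toString(16).padStart(2, "0"))
    .join("");
}
```

Because nothing needs to be stored on the device, conventional cookie controls offer users little protection against this kind of identification – which is one reason the ICO's draft guidance brings "cookieless" tracking techniques of this kind within the scope of the prior-consent requirement.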
Peers approve a trio of amendments to the Data (Use and Access) Bill
A series of amendments to the UK Government's draft Data (Use and Access) Bill (the "DUA Bill") have been approved by the House of Lords.
The amendments will:
- Require processing to satisfy a "public interest test" before it can benefit from the DUA Bill's provisions that afford less onerous data subject consent requirements when personal data is being processed for scientific research;
- Require authorities using digital verification services to ensure the accuracy of the information they hold; and
- Require the government to provide cybersecurity guidance to organisations seeking to access the new National Underground Asset Register (which the Bill will create) before access is allowed.
We have published the first two articles in a series examining the DUA Bill's key provisions in detail – with more to follow – which you can find here.
EDPB consults on pseudonymisation guidance and issues opinion on the interplay between data protection and competition law
The European Data Protection Board ("EDPB") has published a report on controllers' implementation of the GDPR right of access, resulting from coordinated investigations by 30 national data protection authorities ("DPAs") across Europe. In total, 1,185 controllers, from varying sectors and of sizes ranging from SMEs to large companies, responded to the fact-finding exercise that formed the initial stage of the EDPB's Coordinated Enforcement Framework.
Two thirds of participating DPAs evaluated the responding controllers' level of compliance with the right of access as being between "average" and "high". Larger controllers and those receiving more subject access requests were more likely to have reached a higher level of compliance than smaller organisations. Other positive findings included the implementation of best practices by controllers such as the use of user-friendly online forms that allow individuals to submit an access request easily.
The report also identifies seven challenges facing greater compliance with the right of access. These challenges include a lack of documented internal procedures to handle access requests and the barriers that individuals may encounter when exercising their right to access. The report provides a list of non-binding recommendations to be taken into account by controllers and DPAs.
Following this, the EDPB confirmed that its next Coordinated Enforcement Framework, to take place during 2025, will be on the implementation of the GDPR right to request erasure.
EDPB reports on the outcomes of its right-of-access Coordinated Enforcement Framework
The European Data Protection Board ("EDPB") has published a report on controllers' implementation of the GDPR right of access, setting out the outcome of a series of national actions that were carried out in 2024 under the EDPB's "Coordinated Enforcement Framework", in which 30 national data protection authorities ("DPAs") across Europe opened coordinated investigations into the compliance of controllers with the right of access. In total, 1,185 controllers, from varying sectors and of sizes ranging from SMEs to large companies, responded to the fact-finding exercise that formed the initial stage of the Coordinated Enforcement Framework.
Two thirds of participating DPAs evaluated the responding controllers' level of compliance with the right of access as being between "average" and "high". Larger controllers, and those receiving more subject access requests, were more likely to have reached a higher level of compliance than smaller organisations. Other positive findings included controllers' implementation of best practices, such as user-friendly online forms that allow individuals to submit an access request easily.
The report also identifies seven challenges facing more widespread implementation of, and greater compliance with, the right of access. These include a lack of documented internal procedures for handling access requests and the barriers that individuals may encounter when exercising their right of access. The report provides a list of non-binding recommendations to be taken into account by controllers and DPAs in respect of each identified challenge.
Following this, the EDPB confirmed that its next Coordinated Enforcement Framework, to take place during 2025, will be on the implementation of the GDPR right to request erasure.
Ofcom publishes age-assurance implementation guidance
The UK's communications regulator, Ofcom, has published guidance on how it expects services to implement "highly effective age assurance in practice" under the Online Safety Act 2023.
This follows initial proposals, published in December 2023, setting out steps to be taken by services falling under Part 5 of the Online Safety Act ("Part 5 services"), such as providers of adult content, as well as a May 2024 consultation on the regulator's approach to children's access assessments.
Following publication of Ofcom's guidance, all relevant online providers must now take action in order to begin complying with the new rules. Part 5 service providers will have until July 2025 to implement "highly effective age assurance" to ensure that children are not "normally" able to access their content. Meanwhile, providers of user-to-user and search services that are in scope of Part 3 of the Act must carry out a children's access assessment by 16 April 2025.
This guidance will be followed in April of this year by Ofcom's "Protection of Children" statement, which will set out its decisions on the Protection of Children Codes, alongside other guidance.
Once these deadlines have elapsed, Ofcom has warned of strict enforcement action against those that do not comply.
ICO presses ahead with "public sector approach"
In a statement published in December 2024, the ICO has said that it will continue using the "public sector approach" to enforcement where breaches of data protection law by public bodies are alleged or found to have taken place.
The "public sector approach" was first introduced in 2022 on a trial basis and has been subject to a post-trial review, the results of which have now been published. The approach entails the Commissioner exercising his discretion to a greater extent than would normally be the case when considering the level of fine that should be imposed on an entity found to have breached data protection law, when that entity is a public body – on the basis that any fine imposed would constitute public money moving from one public body (the breaching entity) to another (the regulator).
For example, the approach led to the fine imposed on the Police Service of Northern Ireland in October 2024 (which we covered in more detail in a previous edition) being reduced from £5.6 million to £750,000.
Following the post-trial review, the Commissioner has decided to continue using the approach. The Commissioner has, however, committed to providing "greater clarity on [the] parameters" of the approach in future, including on which entities fall within its scope. A consultation concluded on 31 January and will inform the ongoing use of the approach.
UK DSIT announces revised data protection fees
On 16 January, the UK Department for Science, Innovation & Technology ("DSIT") announced the outcome of its consultation on the data protection fee regime. The consultation, which ran from August to October 2024, sought views on the proposal to increase the fees payable by controllers to the ICO by 37.2%, in order to provide the regulator with the necessary funding to discharge its responsibilities.
DSIT received 103 responses as part of the consultation, including a response from the ICO itself. There was general recognition of the need to resource the ICO properly, but respondents also highlighted a need for further clarity on how the proposed fee increase will deliver value for money and how the resources will be allocated to ensure an improvement in the level of service provided by the regulator.
DSIT has now decided on a 29.8% increase in fees, alongside retaining certain exemptions and reductions already in place. The new fees, set out below, are expected to come into effect in early 2025 following the passage of enabling legislation:
Tier | Current Fee | New Fee |
1 | £40 | £52 |
2 | £60 | £78 |
3 | £2,900 | £3,763 |
Cyber security
Home Office, NCSC consult on proposals to reduce the threat of ransomware
The UK Home Office and National Cyber Security Centre ("NCSC") have published a joint consultation on proposed measures to reduce the number of ransomware payments being made by UK companies.
Open for responses until 8 April 2025, the consultation asks for stakeholders' views on various proposals, including:
- Extending the prohibition on public bodies making ransomware payments, following a joint statement made in 2023 by member countries of the international "Counter Ransomware Initiative" which stated that "relevant institutions under the authority of our national government" would not, in future, pay ransomware demands. This prohibition would also extend to "critical national infrastructure".
- Requiring companies that intend to make a ransomware payment to notify the government of this intention prior to making the payment – information the government would use both to provide more comprehensive support to organisations in this position and to prevent payments being made where this would violate terrorist financing or sanctions legislation.
- Requiring companies and individuals to report all ransomware attacks to the government, regardless of any intent to make a payment. The consultation asks whether only companies and individuals above a certain threshold should be subject to this requirement.
International cybersecurity authorities publish joint guidance for organisations on secure operational technology products
The NCSC, alongside partner organisations from various other countries including the United States, Australia, Canada, Germany, the Netherlands and New Zealand, as well as the European Commission, has published guidance for critical infrastructure organisations seeking to improve the cyber resilience of their "operational technology" (or "OT").
The guidance aims to assist owners and operators of OT in supplier selection – choosing products, manufacturers and developers that incorporate "secure by design" principles, so as to minimise the risks posed to critical infrastructure by cyber attacks.
'Major incident' sees US Treasury officials accuse China of cyber attack
Officials in the US Treasury Department have accused China of carrying out a cyber attack in which it gained access to employees' workstations and some unclassified documents.
First disclosed in a letter sent to US senators by the Treasury Department, the breach allegedly originated with a third-party service provider which provided technical support to Treasury employees and was itself compromised in the attack, permitting downstream access to Treasury systems.
Enforcement and civil litigation
European Commission ruled to have breached GDPR in US personal data transfers, ordered to pay damages
The General Court of the CJEU has ruled that the European Commission breached its obligations under the data protection regime that applies to the EU institutions, which contains data transfer provisions equivalent to those of the GDPR (the "EU Institutions' Regulation"). The Commission was found to have allowed the transfer of website visitors' data to Meta, in the US, without putting appropriate safeguards in place.
The judgment, handed down in January, follows an action brought by an individual against the Commission in June 2022. The individual claimed that data was illegally transferred to third countries after he had visited a Commission-managed website and registered for an event using his Facebook account. Additionally, the individual claimed that the Commission failed to respond to his requests for information pertaining to the data transfer.
The court found that the Commission did not "comply with the conditions set by EU law for the transfer by an EU institution, body, office or agency of personal data to a third country" and ordered the Commission to pay €400 in damages to the affected data subject. The decision has prompted speculation about the potential for similar claims based on breaches of the GDPR itself.
Commission reprimanded for breaching data protection obligations in targeted advertising campaign
In a decision marking the second time in as many months that the Commission has been found to have breached the EU Institutions' Regulation, the European Data Protection Supervisor ("EDPS") ruled in December that the Commission had failed to comply with its data protection obligations in undertaking a targeted advertising campaign aimed at users of social media platform X (formerly Twitter).
The European Centre for Digital Rights originally filed a complaint against the Commission in 2023, claiming that it had illegally processed sensitive data when targeting users of the platform with a campaign in support of proposed new regulations aimed at combatting child sexual abuse content. The targeting of the relevant advertising excluded users who had expressed interest in political parties such as the English Defence League, certain other far-right political parties, or Eurosceptic or nationalist political views.
In December, the EDPS confirmed that in so doing, the Commission had illegally processed sensitive personal data. The EDPS disagreed with the Commission's argument that it was entitled to rely on "public interest" as a lawful basis for processing that data; additionally, it found that data subjects could not reasonably have foreseen that the Commission would rely on the "public interest" basis in that context. As the Commission could not rely on public interest as a lawful basis, it would have been obliged to obtain data subjects' consent for the processing – which did not occur.
The EDPS has issued a reprimand, and has given the Commission three months to inform the EDPS of measures it has taken in light of this decision.
Court of Appeal upholds High Court ruling dismissing class action against Google for misuse of private information
In a ruling which appears to severely limit the prospects for bringing "opt-out" group actions in the UK for misuse of private information, the Court of Appeal has upheld a High Court judgment that dismissed an attempted class action against Google.
The case stems from a 2015 agreement between DeepMind and the Royal Free London NHS Foundation Trust, under which DeepMind was provided with data relating to more than 1.6 million patients. The class representative, Andrew Prismall, attempted to bring an opt-out group action against Google, seeking damages for loss of control of patient data.
This represented an attempt to use an alternative route for data protection class actions following the Supreme Court's 2021 decision in Lloyd v Google, with the claim brought in the common-law tort of misuse of private information. However, Mrs Justice Williams did not allow the case to proceed to trial, ruling that it had no prospect of succeeding because it could not be said that all members of the represented class had "the same interest" in the claim. This was because different patients' individual circumstances would affect the extent to which each patient would have had a reasonable expectation of privacy in respect of the information they shared with the Royal Free Hospital. Some claimants may have had such an expectation, but a claim based on the "lowest common denominator" of the class would have had no real prospect of success. This meant that the class as a whole could not form the basis of any action.
The Court of Appeal, upholding the High Court ruling, found that whilst there would normally be a reasonable expectation of privacy in respect of patient information disclosed to healthcare professionals, this did not necessarily hold true in every instance.
This case, together with Lloyd v Google, seems to render remote any prospects of successfully bringing an opt-out class action in the UK in respect of personal data misuse.
High Court rules that problem gambler did not, and could not, freely consent to processing of personal data by gambling platform
In a case brought against Sky Betting and Gaming, the High Court has found that a gambling addict did not, and could not have, freely consented to the use of tracking cookies to collect and use his personal data for subsequent targeted advertising campaigns and profiling where that processing was carried out by a betting platform.
Mrs Justice Collins Rice found that the circumstances of the case, in which the data subject had struggled with problematic gambling for many years and had made several attempts to "self-exclude", meant that – notwithstanding that the data subject had indicated through the betting platform's cookie banner that he consented to having cookies placed on his device – this could not amount to freely given and informed consent to the GDPR standard. The judge held that the data subject had not put his mind to the nature of the processing he was ostensibly consenting to, and that he effectively clicked through the platform's consent mechanisms to "get rid" of a distraction that got in the way of him continuing to gamble – indeed, that his approach to the platform's marketing and data collection efforts was "intimately bound up with" his gambling addiction.
Although the conclusions reached in this case are highly fact-specific, they illustrate the potential dangers of "defective" consent – where users apparently consent to (for example) the placement of targeted advertising cookies on their devices while in fact being in a frame of mind that prevents their agreement from meeting the standard of freely given, informed consent required by the GDPR.
High Court rules against HMRC in case with wider importance for what is considered "personal data"
The High Court has ruled against HMRC in respect of its response to a subject access request made by Sports Direct chief executive Mike Ashley, finding that HMRC adopted too narrow a definition of personal data and had therefore failed to respond adequately to the request.
The subject access request in question was made by Mr Ashley in 2022, seeking access to his personal data held by HMRC in connection with a tax dispute. HMRC provided Mr Ashley with some of the personal data it held on him in response to the request, but a dispute arose as to whether it had provided all the information required.
Mrs Justice Williams, sitting in the High Court, ruled that it was not the case that all data relating to HMRC's tax investigation into Mr Ashley was his personal data simply by reason of the fact that the investigation overall concerned him. The relevant question was whether any given piece of information itself was "related to" Mr Ashley.
However, the court also found that the criteria used by HMRC to determine whether information related to Mr Ashley were too narrow. HMRC had made the determination on the basis of whether information in which Mr Ashley was identified or identifiable was "sufficiently proximate" to him. The Court held instead that information will amount to personal data if it is linked to the data subject by reason of its content, purpose or effect. For example, valuation figures relating to each of the properties owned by Mr Ashley and caught up in the tax dispute were, according to the Court, his personal data.
Additionally, the Court held that the subject access request made by Mr Ashley was not confined to particular agencies or departments within HMRC, and that HMRC had failed to make available to Mr Ashley personal data processed by some of its internal agencies (including the Valuation Office Agency).
CJEU clarifies that intent is key when assessing whether a subject access request is "excessive"
The CJEU has issued a ruling clarifying the criteria that will determine when a subject access request under the GDPR can properly be considered "excessive", justifying a refusal to grant access.
The case originated in a subject access request made to the Austrian data protection authority – one of 77 such requests made by the same individual within the preceding two years. The authority rejected the request on the basis that it was "excessive", relying on a purely numerical threshold for the number of requests within a given time period, above which it would treat any further requests as excessive.
The CJEU's ruling makes clear that a simple threshold of this sort is insufficient: it is not merely the number of requests made, but also the intent behind them, that is determinative. Setting a purely numerical threshold could, the Court found, risk undermining GDPR data subject rights – for example, where an individual is compelled to make a great many requests because of repeatedly inadequate responses to prior requests.
The CJEU instead ruled that what is required is that the individual making the requests must have an "abusive intention" in order for those requests to properly be deemed "excessive". Furthermore, it is the supervisory authority's burden to demonstrate that a request, or set of requests, is excessive, rather than the reverse.
General Court finds EDPB was entitled to order Irish DPC to conduct additional investigations into Meta
In a ruling handed down on 29 January, the General Court of the CJEU held that the EDPB acted within its powers in instructing the Irish Data Protection Commission (the "DPC") to carry out further investigations, in the context of an ongoing DPC investigation and draft decision concerning Meta-owned Facebook and Instagram.
The DPC had been investigating, in its capacity as lead supervisory authority under the GDPR "one-stop-shop" mechanism, whether it was lawful under the GDPR for Facebook and Instagram to rely on "performance of a contract" as a lawful basis for processing their users' personal data to target online advertising.
When the draft decision was referred to the EDPB, it ordered the DPC to carry out additional investigations that it had previously treated as out-of-scope – in particular, to look into whether the platforms had processed special category data as part of the conduct that was subject to the DPC's investigation – and following this, to produce a new draft decision that considered this point.
The DPC challenged the order on the basis that the EDPB had exceeded its authority in making it. The General Court's ruling, however, affirms that the EDPB is entitled to require national data protection authorities to expand the scope of their investigations beyond what the authority itself had intended to pursue.
The ruling may be appealed to the European Court of Justice; at the time of writing, the DPC has not stated publicly whether it will seek to do so.
Meta fined €251 million by Irish DPC over Facebook data breach in 2018
Facebook parent company Meta has received a €251 million fine for a data breach in 2018 that affected around 29 million Facebook accounts.
The fine was issued by the Irish DPC, which has also previously handed Meta a €1.2 billion fine – the largest in GDPR history. In a press release, the DPC set out its findings that Meta infringed Articles 33(3), 33(5), 25(1) and 25(2) of the EU GDPR as a result of the 2018 data breach.
The incident concerned a vulnerability in a video-upload feature rolled out on Facebook in mid-2017. If used in a certain way, in combination with the platform's "View As" feature – which enabled a user to see their own profile as another user would see it – the video uploader would generate a "fully permissioned user token" that could be exploited to gain full access to multiple other individuals' Facebook profiles and all the personal data contained in them.
Speaking about the fine, DPC Deputy Commissioner Graham Doyle commented: "This enforcement action highlights how the failure to build in data protection requirements throughout the design and development cycle can expose individuals to very serious risks and harms".
OpenAI fined €15 million by Garante over data protection failings, says it will appeal
The Italian data protection authority, the Garante, has fined OpenAI €15 million over a series of data protection failings connected with the rollout of ChatGPT in Italy, and has also ordered OpenAI to carry out a public awareness campaign across various media, to "promote public understanding and awareness of how ChatGPT operates" as well as data subjects' rights under the GDPR.
OpenAI has said that it will appeal the decision, arguing that the level of the fine imposed is "disproportionate" and that it will stymie Italy's ambitions for AI development.
We reported in more detail on the fine against OpenAI and its intention to appeal in January's edition of Neural Network, which you can read here.
The Garante has also moved to order Chinese AI startup DeepSeek to block its chatbot in Italy over personal data concerns, following what it called "totally insufficient" responses from DeepSeek to questions put to it by the regulator.
CJEU rules that French national rail company breached GDPR by collecting gender data for online ticket purchases
The CJEU has ruled that French national railway company SNCF breached GDPR by collecting gendered "titles" – such as Mr and Ms – from users of its online ticket-purchasing platform.
The case follows a complaint made to the French data protection authority, the CNIL, by a French association called "Mousse", on the basis that the SNCF online ticket-purchasing platform required individuals to select either a "Sir" or "Madam" option in order to complete a purchase. The CNIL initially dismissed the complaint, but the decision was appealed to the French Conseil d'État, which made a reference to the CJEU for a preliminary ruling on the issue.
The CJEU ruled that the practice of collecting this data, and moreover mandating its collection in order to complete a purchase, breached GDPR. This was because there was no lawful basis on which SNCF could rely for the data collection – neither "legitimate interests" nor "contractual performance" were found to be applicable, as the objective for the processing could reasonably have been accomplished without needing to collect this data.
The data collection was also held to breach the GDPR principle of data minimisation – that is, more personal data was being collected than was strictly necessary for the purpose for which the processing was being carried out – as well as the transparency principle.
Noyb brings complaints against TikTok and five other Chinese tech firms, alleging unlawful transfers of Europeans' personal data to China
Noyb has filed complaints against six Chinese companies for alleged violations of the EU GDPR, claiming that these companies are unlawfully sending personal data outside of the EU to destinations including China.
Noyb lodged complaints in five European countries after the six companies failed to respond adequately to subject access requests made under Article 15 of the EU GDPR. It has requested that the EU data protection authorities order the suspension of data transfers to China and require the companies to comply with the EU GDPR. Additionally, Noyb has asked for fines to be imposed of up to 4% of the companies' global revenue.
Kleanthi Sardeli, a lawyer at Noyb, said that "given that China is an authoritarian surveillance state, it is crystal clear that China doesn’t offer the same level of data protection as the EU. Transferring Europeans’ personal data is clearly unlawful – and must be terminated immediately."
Round-up of enforcement actions
Company | Authority | Fine/enforcement action | Comment |
Illumia | Italy | €678,897 | Fine against an electricity and gas supplier for illegal processing of personal data, following user complaints of unwanted nuisance phone calls. |
Sambla Group | Finland | €950,000 | Provider of loan comparison services breached data protection law through security flaws that allowed third party access to customer personal data. |
KASPR | France | €240,000 | Sales lead provider's collection of personal data for Chrome browser extension database found to have breached GDPR. Customers of the breaching company used personal data stored on the database to contact individuals for commercial purposes. |
Netflix | Netherlands | €4.75 million | Netflix found to have failed to inform customers with sufficient clarity what it does with personal data collected from them. Netflix has reportedly made an initial objection to the fine to the Dutch DPA. |
Panek | Poland | 1.5 million Polish złoty (approximately €360,000) | Data breach incident which led to customer and employee data becoming publicly accessible via Google. |
Unnamed hospital | Belgium | €200,000 | Hospital failed to have sufficient data protection measures in place, meaning that when it suffered a ransomware attack, personal data of over 300,000 people was compromised. |
ESL Consultancy Services | UK | £200,000 | Use of a "ping tree" lead generation method in which an individual's data, entered into an affiliate website offering personal loans, is "pinged" to other affiliates. The fined company made no effort to ensure that consent was in place for the relevant data. |
Key US updates
US Department of Justice issues final rules restricting access to American personal data by countries posing national security concerns
On 27 December 2024, the US Department of Justice ("DoJ") issued a final set of rules imposing a series of restrictions and prohibitions on access to "bulk sensitive personal data" pertaining to US citizens by "countries of concern".
The rules, which give effect to Presidential Executive Order 14117, are designed to ensure that this personal data can no longer be sold to potentially hostile foreign states or to connected persons which may be "leveraged" by those hostile actors.
The final rules take effect 90 days following formal publication in the Federal Register, with certain ancillary measures including due diligence and reporting requirements coming into effect after 270 days. The final rules are available to read in full here.
Apple settles US litigation over third-party sharing of "Siri" voice recordings for $95 million
Apple has entered into an agreement to settle a lawsuit originally filed in 2019, over allegations that its voice-assistance software, Siri, was recording users' conversations and sharing that data with third parties without obtaining relevant consent.
According to the plaintiffs, Apple was illegally recording private conversations where there had been accidental Siri activation, i.e., where no "hot word" was used or button pressed, despite Apple representing to its users that Siri could only be activated using a relevant "hot word" (such as "hey Siri") or by pressing a button on a device.
In June 2024, US Magistrate Judge Sallie Kim found that the tech company had failed to take reasonable steps to preserve evidence that was relevant to the lawsuit and that it should be up to a jury to decide whether Apple had done so intentionally.
As part of the settlement agreement, the tech company has agreed to pay $95 million and to provide non-monetary relief, such as deleting audio recordings and publishing guidance explaining its "Improve Siri" programme – including how it stores data collected from users who have opted into the programme.
In a motion for preliminary approval of the settlement, the plaintiffs proposed that the settlement class should consist of "all individual current or former owners or purchasers of a Siri Device, who reside in the US and its territories, whose confidential or private communications were obtained by Apple and/or were shared with third parties as a result of an unintended Siri activation between September 17, 2014 to the Settlement Date". This could amount to tens of millions of individuals, with each settlement member receiving up to an estimated $20 per relevant device.
US State privacy laws update
State | Date | Law |
Oklahoma | Introduced to State legislature on 13 January 2025 with first reading to follow on 3 February. | Senate Bill 546 (Data privacy: establishing consumer rights for processing of certain data). |
New Jersey | Effective date: 15 January 2025 | New Jersey Data Privacy Law (previously Senate Bill 332).
Our recent publications
If you enjoyed these articles, you may also be interested in the latest in our series on the key provisions of the Data (Use and Access) Bill – this time looking at the proposed changes to the structure and powers of the ICO – which you can read here.