Data Protection update - March 2025

Welcome to the latest edition of the Stephenson Harwood Data Protection update, covering the key developments in data protection and cyber security law from March 2025.
In data protection news, the EU Commission proposed a six-month extension to the UK's adequacy decisions; investigations were announced into TikTok, Reddit and Imgur's handling of children's personal data; the ICO announced its plans to relax aspects of the consent rules for UK online advertisers; the European Health Data Space entered into force; the European Commission addressed questions on the EU-US Data Privacy Framework; and the DIFC proposed amendments to its Data Protection Law.
In cybersecurity news, the European Commission has released draft implementing regulations under the Cyber Resilience Act for consultation; Apple's encryption legal challenge was heard behind closed doors, despite calls for a public hearing; and updated guidelines were introduced by cyber agencies to protect edge devices from increasing threats.
In enforcement and civil litigation news, a settlement was reached with a UK individual, under which Meta agreed not to target the individual with advertisements after she exercised her right to object. In EU litigation, the CJEU provided guidance on the right of rectification; confirmed that an entity without legal personality can be a controller; and issued a decision on transparency in automated decision-making.
Data Protection afternoon – London office - Wednesday 2 April 2025
The Data Protection team, along with colleagues from our Regulatory Litigation and Employment teams, as well as guest specialists from Germany and the USA, will be discussing topics including managing processors, strategies for handling data protection investigations, the weaponisation of DSARs, litigation trends and Artificial Intelligence governance.
We hope you can join us for this event; please register here.
Data Protection
- The European Commission proposes a six-month extension to the UK's adequacy decisions
- TikTok, Reddit and Imgur investigated over handling of children's personal data
- The ICO announces plans to relax aspects of the consent rules for UK online advertisers
- European Health Data Space Regulation enters into force
- European Commissioner addresses questions on the EU-US Data Privacy Framework
- The DIFC invites public comments on proposed amendments to Data Protection Law
Cyber Security
- Apple encryption legal challenge was heard behind closed doors despite calls for public hearing
- Updated guidelines introduced by cyber agencies to protect edge devices from increasing threats
- European Commission releases draft implementing regulations under the Cyber Resilience Act
Enforcement and Civil Litigation
- Settlement reached in targeted ads claim against Meta
- CJEU decision on algorithms and automated decision-making transparency
- CJEU provides guidance on the right of rectification
- CJEU confirms that an entity without legal personality can be a controller under the GDPR
- Norwegian company fined for breaching DPO requirements
- Round-up of enforcement actions
Key US Update
Data Protection
The European Commission proposes a six-month extension to the UK's adequacy decisions
The European Commission (the "Commission") has proposed extending its data adequacy decisions relating to the UK by six months, which would allow for the continued free and secure flow of personal data from the EU to the UK until 27 December 2025. The Commission first recognised the UK as providing an adequate level of protection for personal data post-Brexit in 2021.
This extension allows additional time for the UK to complete its legislative process on the Data (Use and Access) Bill (the "Data Bill"), which was introduced in the UK Parliament on 23 October 2024 (see our update on this here). Following this, the Commission will evaluate the UK's new legal framework and, subject to the outcome of that evaluation, will propose a renewal of its UK adequacy decisions under the GDPR and Law Enforcement Directive.
The European Data Protection Board now needs to provide its opinion on the draft extension decisions, as part of the adoption procedure.
TikTok, Reddit and Imgur investigated over handling of children's personal data
TikTok, Reddit and Imgur are being investigated by the Information Commissioner's Office (the "ICO"). TikTok is being probed over the ways in which it uses children's personal data, while Reddit and Imgur's age assurance measures are being scrutinised. If there is enough evidence to suggest that any of the companies have violated the law, the ICO will contact the relevant platform, allowing it an opportunity to respond before reaching a final decision.
Upholding the information rights of children is a priority area of focus for the Information Commissioner. TikTok's recommender systems are under scrutiny for allegedly surfacing inappropriate and even harmful content to 13 to 17-year-olds in the UK. The ICO will consider possible breaches of data protection legislation and the Children's Code (the "Code"). The Code took effect on 2 September 2020 and covers any online product or service that is likely to be used or accessed by individuals under the age of 18. The Code aligns with, and is enforceable under, the Data Protection Act 2018.
John Edwards, the UK Information Commissioner, has said of the investigation: "If social media and video sharing platforms want to benefit from operating in the UK they must comply with data protection law. The responsibility to keep children safe online lies firmly at the door of the companies offering these services and my office is steadfast in its commitment to hold them to account."
The ICO announces plans to relax aspects of the consent rules for UK online advertisers
The ICO announced its new pro-growth pledges in March, in which it commits, among other things, to relaxing its enforcement of the cookies requirements under the Privacy and Electronic Communications Regulations ("PECR").
The pledge reads that the ICO will be "relaxing enforcement of consent rules for privacy-preserving online advertising, ahead of exemptions to these legal requirements being introduced by government, where appropriate". This particular pledge is part of the ICO's wider review of PECR, as well as a pilot regime for testing out new privacy-preserving online advertisement models. The pledge appears to anticipate the provisions of the Data Bill that relax consent requirements for non-intrusive cookies and trackers.
The ICO also made a series of other commitments, including a pledge to introduce guidance for businesses that are developing or deploying AI. This aims to enable organisations to "unleash" the opportunities presented by AI, while safeguarding personal data.
The new pledges have met with mixed reactions: some have welcomed a review of PECR to ensure that it is fit for purpose, while others have questioned the ICO's independence, as the pledges mark one of several instances of the regulator overtly aligning itself with the Government's economic growth plans. The ICO has responded to some of the criticism, stating that its "desire to promote innovation does not come at the cost of our duty to protect the public".
European Health Data Space Regulation enters into force
On 26 March the Regulation creating a European Health Data Space ("EHDS") entered into force, following its publication in the Official Journal of the European Union on 5 March. The EHDS Regulation will apply to relevant entities from 26 March 2027.
The EHDS will require all electronic health records to comply with a standard format, with the aim of simplifying the exchange of health data.
The EHDS will provide individuals with a number of additional rights, such as the right to download an electronic copy of their personal electronic health data for free and to insert information into their electronic health records.
The EHDS also brings significant enforcement consequences for infringements. Entities that fall foul of the EHDS could be subject to the following fines:
- a maximum of €10 million or 2% of total worldwide annual turnover in the preceding financial year, whichever is higher; or
- for specific infringements, a maximum of €20 million or 4% of total worldwide annual turnover in the preceding financial year, whichever is higher.
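The two "whichever is higher" tiers above can be expressed as a simple calculation. The sketch below is purely illustrative (the function name and structure are ours, not part of the Regulation's text) and assumes turnover is expressed in euros:

```python
def ehds_fine_cap(worldwide_turnover_eur: float, specific_infringement: bool = False) -> float:
    """Return the maximum fine under the EHDS tiers described above.

    The cap is the higher of a fixed amount and a percentage of total
    worldwide annual turnover in the preceding financial year:
    €10m / 2% for the standard tier, €20m / 4% for specific infringements.
    """
    if specific_infringement:
        fixed, pct = 20_000_000, 0.04  # €20 million or 4%
    else:
        fixed, pct = 10_000_000, 0.02  # €10 million or 2%
    return max(fixed, pct * worldwide_turnover_eur)

# For a company with €1 billion turnover, 2% (€20m) exceeds the €10m fixed cap
print(ehds_fine_cap(1_000_000_000))
```

For smaller entities the fixed amount will usually govern; the percentage element only bites once turnover exceeds €500 million on either tier.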
To read more about the EHDS, please click here for an article about navigating the EHDS published on our life sciences hub.
European Commissioner addresses questions on the EU-US Data Privacy Framework
The EU-U.S. digital relationship is poised for change, as U.S. policies shift under President Trump, with the U.S. calling for deregulation as a means to foster digital innovation, and the EU currently considering a more flexible approach to digital regulation. It is unclear how this might impact the future of the EU-U.S. Data Privacy Framework (the "Framework").
European Commissioner Michael McGrath emphasised the EU's commitment to enforce the Framework during a Center for Strategic and International Studies webinar and highlighted a productive meeting with the U.S. Federal Trade Commission Chair, who expressed support for the Framework. His remarks come after data protection authorities in Denmark, Norway, and Sweden issued guidance regarding EU-U.S. data transfers amid the U.S. political shifts. Meanwhile, U.S. Vice President JD Vance criticised the EU's stringent digital regulations at France's AI Action Summit (reported on in our most recent Neural Network bulletin here), warning that over-regulation could hinder innovation, describing the GDPR and Digital Services Act as deterrents to doing business in the EU.
The DIFC invites public comments on proposed amendments to Data Protection Law
The Dubai International Financial Centre (the "DIFC") has requested public input on proposed amendments to its Data Protection Law ("DPL"), DIFC Law No. 5 of 2020, through Consultation Paper No. 1 of 2025.
The proposed amendments aim to refine and elevate data protection standards, aligning them with global benchmarks and international best practices, and in doing so place higher compliance standards on businesses.
The amendments seek to specify that the DPL applies to: (i) DIFC-registered entities processing data anywhere; (ii) entities processing data within the DIFC under stable arrangements; and (iii) those handling data of individuals in the DIFC. This adjustment would align the DPL more closely with international standards like the GDPR, significantly broadening its application and scope.
Selected amendments include:
- a Private Right of Action allowing individuals to seek compensation directly through the DIFC Courts for data protection violations, bypassing the DIFC Commissioner of Data Protection, who is responsible for the supervision and enforcement of the DPL and for overseeing the data protection reporting process. Modelled on the UK Data Protection Act 2018 and the GDPR, this change offers greater legal recourse for individuals and heightens compliance pressure on businesses;
- new obligations for data controllers and processors include assessing the availability of legal redress in importing jurisdictions and enhancing the DIFC Commissioner’s role in evaluating the adequacy of third-country data protection; and
- increased fines for breaches, such as $25,000 for failing annual assessments, $50,000 for neglecting Data Protection Impact Assessments (DPIAs) in high-risk processing, and $50,000 for non-compliance with data-sharing obligations.
The consultation has now closed, with final enactment of the amendments anticipated later this year.
Cyber Security
Apple encryption legal challenge was heard behind closed doors despite calls for public hearing
Apple has filed an appeal against a government notice that demanded it grant law enforcement access to data encrypted by its Advanced Data Protection ("ADP") service on iCloud, as we reported here last month. The notice was issued under the Investigatory Powers Act 2016, which compels companies to help law enforcement obtain evidence.
The Investigatory Powers Tribunal conducted a secret, day-long hearing regarding the matter, despite calls for the hearing to be held in public. The case was heard by Investigatory Powers Tribunal President Lord Justice Rabinder Singh and High Court judge Jeremy Johnson.
While the appeal against the notice is ongoing, Apple disabled its ADP service for UK customers in February 2025.
Updated guidelines introduced by cyber agencies to protect edge devices from increasing threats
In early February the National Cyber Security Centre ("NCSC") unveiled new guidelines to help manufacturers of edge devices ensure their products are secured against increasing cyber security threats. The guidelines are a coordinated effort with other international cyber security agencies, including those in Australia and the US.
The NCSC highlighted that the purpose of these guidelines is to encourage device manufacturers to include and enable standard logging and forensic features that are both robust and secure by default. This is so network defenders can detect malicious activity more easily and investigate after any intrusions.
The guidelines also set out minimum standards for forensic visibility to help network defenders secure organisational networks.
Speaking on the new guidelines, the technical director of the NCSC, Ollie Whitehouse, said: "in the face of a relentless wave of intrusions involving network devices globally our new guidance sets what we collectively see as the standard required to meet the contemporary threat… we are focused on nurturing a tech culture that bakes security and accountability into every device, while enabling manufacturers and their customers to detect and investigate sophisticated intrusions".
European Commission releases draft implementing regulations under the Cyber Resilience Act
After the entry into force of the Cyber Resilience Act (the "CRA") at the end of last year, the Commission has now released a draft regulation setting out detailed technical descriptions for certain categories of important and critical products with digital elements, fulfilling a requirement of the CRA. The regulation supports the identification of products that may be subject to more stringent conformity assessment procedures than other products with digital elements.
As we reported in our November 2024 article here, the CRA applies from 11 December 2027. It is applicable to organisations involved in all stages of the supply chain of products with digital elements (such as "internet of things" devices) and imposes increasingly stringent measures, depending on the security risk of the product. Products considered "important" (e.g. password managers, routers, VPNs, smart home virtual assistants) must undergo a conformity assessment procedure prior to their placement on the market, and products considered "critical" (e.g. hardware devices with security boxes, smart meters or smart cards) must obtain a European cybersecurity certificate under a European cybersecurity certification scheme.
The technical descriptions for important and critical products with digital elements are set out in annexes to the regulation and the Commission is inviting the public to share their feedback on the proposals here.
The feedback period closes on 15 April 2025.
Enforcement and Civil Litigation
Settlement reached in targeted ads claim against Meta
A significant settlement with the potential to impact Meta's advertising model was reached last week, with Meta agreeing to stop processing the personal data of one UK individual for direct marketing purposes. Tanya O'Carroll successfully challenged Meta for unlawfully using personal data to deliver targeted advertising. Her claim, filed in March 2022 under Article 21(3) of the UK GDPR, argued that Meta was not complying with her right to object to targeted advertising and continued to use her personal data for this purpose, resulting in ongoing targeted ads on her social media accounts. O'Carroll emphasised that this use of her data was inconsistent with what users agree to when they create social media accounts, describing it as invasive and a high cost for using these platforms.
Meta defended the claim, stating that users could not expect a business to provide free services; and maintained that it was adhering to UK GDPR, providing the necessary tools for its users to manage advertising preferences.
The ICO published a statement supporting O'Carroll and upholding her right to object to such processing. The ICO further stated that individuals may complain if they believe their personal data is being unlawfully processed, and that it will continue to engage with Meta on this specific issue. The case may set a precedent for future claims against tech giants that generate revenue from the use of users' personal data. However, it is also likely to push social media giants further towards pay-or-consent models, as they seek to ensure that the business of targeted advertising remains commercially viable.
CJEU decision on algorithms and automated decision-making transparency
The Court of Justice of the European Union ("CJEU") has published a significant decision on the right of data subjects to request access to their personal data under Article 15 of the GDPR. This is particularly significant as it clarifies the duty of transparency in the context of automated decision-making.
This decision follows the non-binding opinion issued on the same case by Advocate-General Richard de la Tour in September last year. You can read about the Advocate-General's opinion in our September 2024 data protection update here.
The CJEU ruled that in cases that involve automated decision-making, controllers must provide concise, transparent, intelligible and easily accessible explanations of the procedures and principles applied by automated methods to the data subject's personal data that led to a specific result, noting that communicating "a complex mathematical formula" would not suffice.
Additionally, the CJEU recognised that the right to protection of personal data is not absolute and must be proportionally balanced against other rights. The court set out that "wherever possible" the controllers should favour "means of communicating personal data to data subjects that do not infringe on the rights or freedoms of others". If a controller believes that disclosing certain information may violate trade secrets, it must submit the potentially protected information to a competent supervisory authority or court for it to be considered. It will then be the authority or the court that will balance the rights and interests to determine the extent of the data subject's right to access that particular information.
CJEU provides guidance on the right of rectification
In a further judgment handed down on 13 March, the CJEU provided useful guidance on the right to rectification set out in Article 16 of the GDPR.
The case involved an Iranian national who had obtained refugee status in Hungary. To support their application for refugee status, the individual relied on their transgender identity and produced medical certificates evidencing that they were registered female at birth but that their gender identity is male. Despite this, the individual was registered as female in the asylum register.
In 2022 the individual submitted a request, in reliance on Article 16 of the GDPR, to have their entry rectified to change their gender to male and to amend their first name. The Hungarian authority refused the request and the individual sought to annul that decision.
In its judgment, the CJEU ruled that where national authorities are responsible for keeping a public register, they must rectify the personal data relating to the gender identity of a natural person where that data is inaccurate. The court held that Article 16 of the GDPR must be interpreted as meaning that the person requesting rectification may be reasonably required to provide relevant and sufficient evidence in order to establish that the data held is inaccurate. However, a Member State cannot by way of an administrative practice make the exercise of that right conditional upon the revelation of evidence of gender reassignment surgery.
CJEU confirms that an entity without legal personality can be a controller under the GDPR
The CJEU issued a judgment at the end of February providing clarification on the concept of a controller. According to the CJEU, the concept is not limited to the definition in Article 4(7) of the GDPR, and administrative entities that lack legal personality may be designated under national law as controllers of personal data in particular circumstances. This is provided that: (i) national law can determine the scope of the entity's responsibilities for the processing of personal data; and (ii) the entity can fulfil controller obligations.
This matter arose when the Austrian data protection authority (the "DSB") received a complaint from an individual against the Office of the Tyrolean State Government of Austria (the "Office"), an administrative entity without legal personality of its own, for its use of personal data to send the individual a vaccination reminder letter. The DSB found that the Office's processing was unlawful, as the Office did not have the right to consult the vaccination register.
Norwegian company fined for breaching DPO requirements
The Norwegian Data Protection Authority ("DPA") fined Telenor ASA for breaching the Data Protection Officer ("DPO") requirements under Articles 37-39 of the GDPR. The investigation revealed that Telenor had terminated its DPO, claiming it did not meet the criteria for a mandatory appointment under Article 37(1) of the GDPR, but did not provide documentation supporting this decision. A temporary DPO was appointed after the investigation began.
The investigation found that the DPO’s role was split between data protection duties and associate lawyer responsibilities, with Telenor unable to produce documentation to confirm that most of the DPO’s time was spent on data protection. Furthermore, a significant backlog of data protection tasks was noted for 2021.
The DPA questioned the DPO's independence, as the role was combined with an associate lawyer position and correspondence for DPO purposes was directed to the same email address as that used for the legal role. The DPA found violations of several GDPR provisions, including inadequate documentation, failure to involve the DPO in all data protection matters, and a lack of resources for the DPO.
The DPA ordered Telenor to update its Record of Processing Activities, assess the need for a DPO, and implement measures to ensure the role's independence.
The fine of €351,477.64 was imposed for failings under the GDPR and the Norwegian Data Protection Act.
Round-up of enforcement actions
Company | Authority | Fine/enforcement action | Comment |
Orange Espagne, S.A.U. | Spain | €1,200,000 | An individual filed a complaint because the company had given a duplicate of their SIM card, without their consent, to an unauthorised third party who turned out to be fraudulent. The company's lack of verification mechanisms allowed the fraudsters to gain access to the data subject's bank account. The investigation found that the company had failed to verify the identity of the third party or to obtain the data subject's consent to share their data. |
A Real Estate Company | France | €40,000 | A real estate company was fined by the French Data Protection Authority ("CNIL") for unlawfully monitoring its employees. Employees working in the office were continuously filmed and the company implemented a software program that tracked "periods of inactivity" by taking screenshots of employees' computers while working remotely. The CNIL determined that these surveillance practices were disproportionate and violated data security regulations. The company failed to properly inform employees about the monitoring and neglected to conduct a required data protection impact assessment. |
Key US Update
Genomic Data Protection Act proposal introduced to US Senate
At the beginning of March, two US Senators reintroduced the Genomic Data Protection Act to the US Senate for consideration, building on a previous attempt in late 2024. The Genomic Data Protection Act (the "GDPA") would give consumers rights over their genetic data and grant individuals the right to access and delete their genomic data and request the destruction of biological samples, with exceptions for legal obligations such as court orders. Companies would need to provide clear notice to consumers about their rights, including the potential use of their anonymised data for research.
The GDPA would target "direct-to-consumer genomic testing companies", including businesses that manufacture testing products, analyse genomic data, or collect and disclose such data from consumers, with the aim of regulating the genetic data collected by such companies. Violations of the GDPA would likely be enforced by the Federal Trade Commission. The GDPA was referred to the Committee on Commerce, Science, and Transportation on 3 March 2025 and its progress can be tracked here.
Our recent publications
If you enjoyed these articles, you may also be interested in the latest in our series on the key provisions of the Data (Use and Access) Bill, this time looking at provisions in the Data Bill around digital ID verification schemes, with the creation of a new register of Digital Verification Service providers. You can read this here.