Data Protection update - February 2025

Welcome to the Stephenson Harwood Data Protection update, covering the key developments in data protection and cyber security law from February 2025.

In data protection news, the ICO has published new guidance on data protection considerations relating to storing employee records; the EDPB has published a statement setting out key principles for data protection in the context of age verification; various legislative developments have seen two data-related Bills currently before Parliament move from the Lords to the Commons, as well as the withdrawal of the long-stalled ePrivacy Regulation at the EU level; and increased data protection fees in the UK have come into effect.

In cyber security news, Apple has disabled a key end-to-end encryption feature for UK iPhone users after reports of a Government order to permit access under the Investigatory Powers Act; and the UK Department for Science, Innovation and Technology ("DSIT") has published a new code of practice for AI cyber security.

In enforcement and civil litigation news, the ECJ has ruled that fines against subsidiary companies under the GDPR must take into account the worldwide revenue of the group as a whole; and regulatory action against AI model developer DeepSeek continues to gather pace in the EU following the Garante's decision to ban DeepSeek from operating in Italy or processing Italian users' personal data.

Data protection

ICO publishes employee records storage and retention guidance

The ICO has published guidance on "Employment practices and data protection: keeping employment records". This guidance, which is intended to be read alongside other published ICO guidelines on data protection and employment, aims to help employers understand their obligations under data protection law.

The guidelines outline the types of records that employers may need to keep about their workers (such as training records and pension information), explain that employers must identify a lawful basis under which they can retain their employees' personal information, and identify the particular lawful bases that, in the ICO's view, are most relevant in an employment records context. The ICO highlights necessity for the performance of a contract with the worker, legal obligation, legitimate interests and vital interests, although it does note that any of the lawful bases may apply.

The guidance outlines the need for employers to identify the minimum amount of personal data they need to hold about their workers, and not to exceed that amount. It is also the employer's responsibility to take reasonable steps to ensure that this personal data is accurate and up to date. Finally, the regulator gives guidance on how employers should determine how long to retain their employees' personal data in order to uphold the storage limitation principle, what must be done to keep this data secure, what information employees must be given about the use of their personal data, and employees' right to access their data.

EDPB publishes "age assurance" statement and guidance

The European Data Protection Board ("EDPB") has published a statement setting out a series of key principles for ensuring data protection compliance when processing data for the purposes of age verification.

Adopted during the EDPB's February plenary session, the statement acknowledges the increasing requirements on organisations, arising from various regulatory quarters including the Audiovisual Media Services Directive, the Digital Services Act and the GDPR itself, to carry out age verification in respect of their users.

In this context, the statement seeks to "reconcile the protection of children and the protection of personal data in the context of age assurance".

The statement sets out ten high-level principles, each accompanied by more specific guidance. These include, among others, that:

  • "Age assurance should always be implemented in a risk-based and proportionate manner that is compatible with natural persons’ rights and freedoms";
  • "Age assurance should not lead to any unnecessary data protection risks for natural persons … in particular, age assurance should not provide additional means for service providers to identify, locate, profile or track natural persons"; and
  • "Age assurance should demonstrably achieve a level of effectiveness adequate to the purpose for which it is carried out."

The full statement can be found here.

UK data protection fees increase becomes effective

Following the Government's response in January to its consultation regarding the ICO fees regime, the increased fee levels came into force on 17 February.

The fees have now been increased by 29.8%, with the annual fees for each tier now standing at:

  • Tier 1 (micro-organisations) - £52
  • Tier 2 (small and medium organisations) - £78
  • Tier 3 (large organisations) - £3,763

Recent EU and UK legislative developments

February has seen various developments in proposed legislation in both the UK and the EU:

Commission withdraws draft ePrivacy Regulation from legislative consideration

The European Commission has announced that it will not be taking forward the long-stalled ePrivacy Regulation.

The draft Regulation, originally proposed in 2017, was intended to accompany the GDPR. It would have updated the 2002 ePrivacy Directive, introducing new standards and requirements for electronic and online communications and data protection, for example on cookies.

The Commission has now formally indicated that the Regulation will not be proceeding any further, citing the lack of any prospect of political agreement or consensus on the proposals. Differences of opinion primarily concerned the balance to be struck between implementing sufficiently future-proof technical measures and specific controls to protect and uphold data privacy, and avoiding overly restrictive measures being placed on businesses.

Alongside the withdrawal of the ePrivacy Regulation, the Commission also announced that the mooted "AI Liability Directive" will not be proceeding. We covered that announcement in the last edition of our AI-focused newsletter, Neural Network, which you can read here.

DUA Bill reaches the House of Commons, amendments approved by House of Lords

Changes to the UK's data protection regime have come a step closer after the Data (Use and Access) Bill made its way across to the House of Commons last month.

The Bill, which began life in the House of Lords in October last year, has now passed all its legislative stages in that House, and will be considered by the House of Commons, where it has already passed the (purely procedural) "first reading" stage.

Several amendments were made to the Bill during the Lords' consideration of it, and these amendments now form part of the Bill as it will be considered by the other House.

These amendments include:

  • Creating new offences in respect of AI-generated sexually explicit "deepfakes";
  • Amending the provisions in the Bill that will ease the consent requirements for data processing when this is for scientific research purposes, to require that such research is "in the public interest" before the relaxed requirements can apply;
  • Adding an additional duty on the part of the ICO to have regard to the particular needs of children in respect of protecting personal data;
  • Extending the direct marketing "soft opt-in" to charities – under which, where an organisation has an existing relationship with a data subject through the sale of goods or services, direct marketing can be sent to that data subject as long as an option to opt out is given; and
  • A new ICO duty to regulate the transparency of web crawlers used for data scraping, with a view to giving IP owners and "creatives" greater capacity to enforce their IP rights.

To accompany the Bill's passage to the Commons, the ICO has also published its updated response to the Bill. The regulator had already largely welcomed the Bill when it was first introduced into Parliament, and the ICO's responses to all the amendments added in the House of Lords are similarly positive.

We have published the first three articles in an ongoing series that examines in detail the provisions in the draft Bill and their potential impact on the UK's data protection regime. You can find the latest article here, and the full series can be accessed here.

Public Authority Algorithmic and Automated Decision-Making Systems Bill moves to House of Commons

A new Bill to regulate automated decision-making systems as used in the UK public sector was introduced in the House of Lords in September of last year. Following completion of its "third reading" stage in the Lords, this Bill has also now moved to the Commons.

International data protection authorities sign joint AI data governance declaration

On 11 February, the ICO, along with the Irish, French, Australian and South Korean data protection authorities, signed a declaration committing to develop a joint AI data governance framework.

In the declaration, the various authorities recognise the significant opportunities that AI presents, but also highlight the considerable risks it entails (calling out risks of discrimination, misinformation and hallucination that are often caused by inappropriate data processing). Bearing this in mind, the declaration commits the authorities to:

  • Foster their "shared understanding of lawful grounds for processing data in the context of AI training" in their respective jurisdictions;
  • "Exchange information and establish a shared understanding of proportionate safety measures";
  • Monitor "both the technical and societal implications of AI" and "leverage the expertise and experience" of the authorities and other organisations and bodies, so far as possible, when AI policy is being considered and made;
  • "Reduce legal uncertainties and secure space for innovation where data processing is essential for the development and deployment of AI"; and
  • Strengthen their interactions with other relevant authorities to "facilitate consistency and foster synergies between different applicable regulatory frameworks" as they apply to AI systems.

The declaration is available to read in full on the website of the Office of the Australian Information Commissioner, here.

ICO launches direct marketing advice generator

The ICO has launched a new tool aimed at assisting organisations that carry out direct marketing to comply with UK data protection and privacy laws – in particular, the UK GDPR and the Privacy and Electronic Communications Regulations 2003.

Particularly aimed at small companies, the new "Direct Marketing Advice Generator" comprises a multi-stage questionnaire, after which the organisation is provided with "advice" corresponding to the answers it has given.

For the time being the tool is in "beta" phase and open for feedback, which can be submitted here.

Cyber security

Apple disables end-to-end encryption feature in the UK after reports of Government order to permit access

Following news reports that the UK Government had ordered Apple to create a "backdoor" to allow it to access fully encrypted files, Apple has now confirmed that it will instead be disabling its highest level of data protection for UK iPhone users.

Apple's "Advanced Data Protection" feature incorporates end-to-end file encryption such that only the user can read the files, and then only on their own devices. It was reported in February that the Government had issued an order under the Investigatory Powers Act 2016 ("IPA") to create a means by which the Government could access these files.

In response, Apple announced that the Advanced Data Protection feature will no longer be available to UK users. This means that UK customers will no longer have the option to have their data stored on Apple's iCloud storage service fully encrypted. The option is no longer available to new users, whilst users who had already turned the feature on will be compelled, over time, to disable it.

As orders under the IPA are normally kept confidential, it is not public whether similar orders have been issued to other technology service providers, nor is there information available about the number or scale of IPA requests that have been made for Apple users' data that is not stored using Advanced Data Protection.

DSIT publishes AI cyber security code of practice

DSIT has published a new "Code of Practice for the Cyber Security of AI", which aims to protect artificial intelligence systems from cyber threats and will be used to help create a global standard through the European Telecommunications Standards Institute.

The code, which is voluntary, applies to AI systems incorporating deep neural networks and provides guidelines for securing those systems across their lifecycle. It sets out how organisations using AI can protect themselves from cyber threats, including attacks on AI systems and system failures. This includes steps such as implementing cyber security training programmes focused on AI vulnerabilities, developing recovery plans for potential cyber incidents, and carrying out robust risk assessments. DSIT has also created an implementation guide which aims to support organisations in adhering to the requirements of the code. It should be noted that the code will not apply to AI systems that have been developed in the context of academic research and will not be deployed.

The code is structured around 13 overarching principles (including requirements to "raise awareness of AI security threats and risks" and "secure your infrastructure"), and goes on to set out relevant standards and publications for each principle alongside more detailed guidance for implementing these principles in practice.

The code is available in full here.

Enforcement and civil litigation

ECJ rules that fines against subsidiaries must account for groupwide global revenue

The European Court of Justice ("ECJ") has issued a judgment clarifying how the maximum level of fine under the GDPR should be calculated for subsidiary companies within a wider group, following a preliminary reference by the High Court of Western Denmark to interpret the meaning of "undertaking" under Article 83 of the GDPR.

Article 83 sets the conditions governing how national data protection authorities can impose administrative fines for breaches of the GDPR. Fines under Article 83, when levied on an "undertaking", are capped at either 2% or 4% (depending on the GDPR provision breached) of that undertaking's total worldwide annual turnover for the preceding financial year.

The ECJ has now ruled:

  • That the maximum fine should be determined based on a percentage of the undertaking's total worldwide annual turnover from the previous business year; and
  • That the definition of "undertaking" must be understood in accordance with articles 101 and 102 of the Treaty on the Functioning of the European Union (which set out a series of EU competition laws). Drawing on the concept of an "undertaking" as set out in those treaty provisions, the court held that this encompasses "any entity engaged in an economic activity, irrespective of the legal status of that entity and the way in which it is financed".

The effect of this is that the overall group's worldwide annual turnover for the preceding year, not just that of the subsidiary, is used in calculating the maximum size of the fine.

Additionally, the ECJ held in this case that this concept of an "undertaking", and the position of a subsidiary company in the context of the broader group it sits within, is relevant not only for calculating the cap, but also in considering the size of the actual fine to be imposed. That is, whether a fine is "effective, proportionate and dissuasive" will depend, in part, on the economic and financial position of the overall undertaking – not just the particular subsidiary entity – against which it is levied.

The High Court of Western Denmark's request arose from an appeal against a 100,000 kroner (approximately €13,400) fine issued against furniture retailer ILVA in 2021. In its appeal, the Danish Public Prosecutor's Office sought a 1.5 million kroner (approximately €200,000) fine – calculated based on the annual turnover of ILVA's parent company, the Lars Larsen Group. It now remains to be seen how the ECJ's findings will affect the national court's ultimate decision in the case.

Garante bans DeepSeek over data protection concerns, other European regulators investigating

AI model developer DeepSeek is facing intensifying regulatory scrutiny following its explosive entry onto the AI scene in January 2025, including from several EU member state data protection regulators.

We reported in our most recent edition of the Neural Network – which you can read here – on the investigations and other enquiries that DeepSeek was already facing as of mid-February from EU regulators. Most notably, the Italian data protection authority, the Garante, has banned DeepSeek from operating in Italy and from processing Italian users' personal data, after its responses to questions posed by the regulator were described as "entirely unsatisfactory" by the regulator itself.

Since then, the regulatory interest has continued apace. The Belgian and Portuguese regulators have confirmed that they have received complaints relating to DeepSeek; the French, Irish and Luxembourg national regulators as well as several of Germany's regional data protection authorities have variously posed questions to DeepSeek and discussed potential next steps among themselves; and the European Commission itself has stated that it is evaluating whether DeepSeek has violated any EU laws.

$725 million Facebook Cambridge Analytica settlement approved by US court

Facebook parent company Meta Platforms has received approval from a US appeals court for a $725 million settlement.

The settlement is expected to bring to an end seven years of litigation arising from the Facebook Cambridge Analytica scandal. The scandal, which first came to light in 2018, saw Facebook acknowledge that as many as 87 million users of its platform had their personal data accessed by an app developer who had hosted a personality quiz app on the platform – an app that had been downloaded by only around 270,000 users.

This led to several lawsuits that were consolidated before US District Judge Vince Chhabria, who first approved the settlement amount in 2023. His approval of the settlement has now been affirmed by a three-judge panel of the US Court of Appeals for the Ninth Circuit.

The panel found that Judge Chhabria had properly evaluated the proposed settlement as fair, reasonable and adequate for the national class of affected Facebook users. Additionally, the panel noted that the district court had "properly evaluated the settlement for the 'higher standard of fairness' required of pre-class certification settlements".

Round-up of enforcement actions

Company: E.ON Energia S.p.A.
Authority: Italy
Fine/enforcement action: €890,000
Comment: Complaints made by individuals regarding unwanted telemarketing calls, and a lack of response from the company when they attempted to exercise their GDPR rights. Failure to establish an appropriate legal basis for processing, failure to implement appropriate technical and organisational measures, and failure to respond promptly and adequately to data subjects' requests to exercise their GDPR rights.

Company: Centrum Medyczne Ujastek Sp. z o.o.
Authority: Poland
Fine/enforcement action: Approx. €271,030 (1,145,891 Polish złoty)
Comment: Illegal surveillance and unlawful video monitoring in a hospital ward, and failure to apply appropriate technical and organisational measures to ensure security corresponding to the level of risk involved in the processing – external memory cards containing personal data were stolen and/or lost.

Company: Toyota Bank Polska SA
Authority: Poland
Fine/enforcement action: Approx. €135,505 (576,220 Polish złoty)
Comment: Profiling of customers using personal data, to determine creditworthiness and credit risk scores, was not recorded in the register of data processing operations, and no data protection impact assessment was carried out. Additionally, the Data Protection Officer was situated at a point in the hierarchy where they could not be said to be fully independent.

Company: Generali España
Authority: Spain
Fine/enforcement action: €4,000,000
Comment: Original fine of €5 million reduced to €4 million for prompt payment. The company suffered a personal data breach due to an attack on its customer maintenance system, affecting more than 1.5 million individuals. Technical failures in the platform permitted unauthorised third-party access to personal data of clients and former clients.

Company: Orange España
Authority: Spain
Fine/enforcement action: €1,200,000
Comment: The company failed to verify the identity of a person who obtained a duplicate of a customer's SIM card without proper authorisation from that data subject; the duplicate was then used to steal money from that individual.
 
Our recent publications

If you enjoyed these articles, you may also be interested in the latest in our series on the key provisions of the Data (Use and Access) Bill – this time looking at the provisions in the Bill that will enable a range of new smart data schemes across various sectors – which you can read here.