Data Protection update - November 2024

Welcome to the Stephenson Harwood Data Protection update, covering the key developments in data protection and cyber security law from November 2024.

In data protection news, the ICO has published new guidance on matters including the rollout of privacy-enhancing technologies by organisations and data-sharing for fraud prevention purposes; the ICO has also indicated that it will seek to rely on a recent joint statement on data scraping, signed alongside other international regulators, in its attempt to revive its enforcement action against Clearview AI.

In cybersecurity news, the EU Cyber Resilience Act has been published in the Official Journal of the European Union; and the ICO has applied for permission to appeal against a recent Upper Tribunal ruling against it relating to a data breach at DSG Retail, which could have significant implications depending on how it is ultimately decided.

In enforcement and civil litigation news, the German Federal Court of Justice has ruled that mere loss of control over personal data can constitute non-material damage, contrasting the approach taken by UK courts; the Italian Data Protection Authority has publicly rebuked a bank for initially downplaying the scope and severity of a data breach when it was first reported to the regulator; and Meta has been fined more than $15 million in South Korea for unauthorised use of personal data.

Data protection

Data (Use and Access) Bill continues progress through Parliament

The Data (Use and Access) Bill ("DUA Bill") has continued its passage through the UK Parliament, passing its second reading in the House of Lords on 19 November.

Although the Bill met with a largely positive reception, lawmakers nevertheless expressed concerns during the debate over a number of its features, including: its proposed relaxation of restrictions on automated decision-making; the independence from ministerial interference of the new Information Commission that the Bill would create; the breadth of the proposed new statutory definition of scientific research, and whether this might be "susceptible to misuse", such as by AI developers; and the Bill's general lack of provisions addressing AI concerns such as data scraping and the use of copyrighted material.

We have published the first in a series of deep-dive articles on the Bill's key provisions and the ways in which it proposes to change the data landscape in the UK. In our first article, which you can read here, we consider the main ways in which the Bill would, if passed as presently drafted, change the UK's data protection regime.

Subsequent articles in our series, to be published soon, will look at topics including the contemplated changes to the structure of the ICO and its powers of enforcement; provisions in the Bill for developing a certification regime for digital verification services; and the Bill's ambitions to remove barriers to smart data initiatives and to accessing and using public sector data.

The article series can be accessed at this page on our data protection hub and will be updated as further articles are published.

ICO publishes new tool and accompanying guidance for organisations deploying privacy-enhancing technologies

On 15 November, the ICO published a guide for assessing the costs and benefits associated with Privacy Enhancing Technologies ("PETs"). This guide, along with the regulator's recently released cost-benefit awareness tool, forms part of the ICO's wider effort to encourage companies to adopt PETs as soon as possible.

The new tool, a collaboration between the ICO and the Department for Science, Innovation and Technology, focuses on "emerging PETs" – a term used to describe a set of novel technologies intended to address privacy issues in "data-driven systems".

The guide and the new tool are structured around an example of using PETs for privacy-preserving federated learning, providing a framework for considering the costs and benefits associated with a range of PETs. Additionally, the tool contains information on compliance costs and benefits, illustrating how the use of PETs can reduce both the risks to individuals and compliance costs.

These new resources constitute the latest efforts by the ICO to encourage the uptake of PETs, following earlier signals from the regulator that it will be changing its approach to driving the uptake of PETs by organisations generally. Speaking on a panel in April 2024, the ICO's director of technology and innovation, Stephen Almond, made clear that PETs will become an expectation in certain use cases, rather than merely a "nice to have". Almond stated that, in future, "if we find that you have not employed PETs in this particular use case, we will actually take action" and that "as a regulator, we're saying this in advance, and we'll be clear when we've switched modes".

Additionally, the regulator has spent the last few months indicating that companies in sectors such as finance, healthcare and advertising should be particularly proactive in adopting PETs.

ICO will seek to rely on international joint data scraping statement to revive enforcement action against Clearview AI

In the latest development in the long-running saga of the UK data protection regulator's attempt to impose a regulatory penalty on Clearview AI, the ICO has indicated that it will seek to rely on a joint statement it recently signed alongside numerous other international data protection authorities.

The ICO is currently appealing a ruling by the First-tier Tribunal, handed down in October 2023, that it did not have the power to bring regulatory action or to impose regulatory penalties against facial recognition AI developer Clearview, because the company's activities fell outside the scope of the UK's data protection regime. As part of its ruling, the Tribunal overturned a fine of more than £7.5m initially levied against Clearview in May 2022. The fine was imposed by the ICO after it found that Clearview had conducted data scraping to collect billions of images of individuals' faces from online sources, including images of UK citizens, and had used this data to train its facial recognition model.

The joint statement that the ICO is hoping will bolster its case concerns data scraping and the privacy and data protection risks associated with the practice. It sets out a series of requirements with which, in the signatory regulators' view, organisations engaged in data scraping should comply.

The ICO is arguing that the joint statement provides evidence that in bringing an action against Clearview, it is not simply asserting jurisdiction over the company in respect of the data it holds about UK citizens, but that there is also an internationally recognised standard, in the form of the joint statement, which it is "entitled to uphold in the protection of our citizens".

The ICO will nevertheless have substantial obstacles to overcome in its efforts to persuade the UK courts and tribunals of its position. Its initial application to the First-tier Tribunal for permission to appeal was recently refused, so it has applied directly to the Upper Tribunal for such permission.

ICO voices concerns over reluctance to share data for fraud prevention, publishes practical guidance for organisations

The ICO has published new guidance for organisations seeking to protect their customers from scams and fraud efforts, clarifying ways in which organisations can share personal information responsibly in the course of these efforts without falling foul of data protection requirements.

The overarching theme of the ICO's guidance is the regulator's concern that there is a "reluctance" among organisations to share personal information for the purposes of scam and fraud detection and prevention. It attributes this to misplaced data protection concerns, which it warns may "lead to serious emotional and financial harm" for individuals by reducing the effectiveness of customer protection efforts.

The ICO is seeking to make clear that "data protection law does not prevent organisations from sharing personal information, if they do so in a responsible, fair and proportionate way" – and the regulator goes so far as to say that data protection law is "not an excuse" for organisations not to share data in the course of these efforts.

The guidance sets out a series of practical steps that organisations can and should take when seeking to share personal data for these purposes. These steps include carrying out a data protection impact assessment, clear assignment of responsibilities between the organisations with whom the data will be shared (in particular, whether the organisations will act as separate or joint controllers), putting data sharing agreements in place, and clearly identifying in advance the relevant lawful basis for the data sharing. The guidance notes that relevant lawful bases that organisations may be able to rely upon, depending on the particular circumstances, include consent, legitimate interests and the performance of a contract.

Information sharing in the regulated sector: reducing or increasing risk?

Related to the story above, the UK Government recently issued guidance on the information sharing provisions contained within Section 188 of the Economic Crime and Corporate Transparency Act 2023.

These information-sharing provisions "disapply civil liability for direct sharing of customer information" where such sharing is done for the purposes of detection, investigation and/or prevention of fraud and other "economic crime". These provisions apply to "all businesses in the anti-money laundering regulated sector".

The Government's guidance on these provisions sits alongside a recent update from the UK's financial regulator, the FCA, which "strongly encourage[s]" regulated firms to engage in information-sharing initiatives for the purposes of tackling financial crime. Notwithstanding the clear governmental and regulatory steer in the direction of such initiatives, however, there are still risks that regulated firms will need to account for when relying on Section 188 to share information in this way. These risks are considered in detail in our recent article, which you can read here.

Cyber security

EU Cyber Resilience Act published

The EU Cyber Resilience Act ("CRA") has been published in the Official Journal of the European Union. This starts the clock on a three-year phased implementation period, with the first obligations under the Act beginning to bite in September 2026.

The CRA applies to various types of organisations involved in all stages of the supply chain of products with digital elements (such as "internet of things" devices) and imposes obligations to ensure that such products are secure to use, provide sufficient information on their security measures and are resilient against cyber threats.

The measures being introduced by the CRA include mandatory cybersecurity requirements for relevant products, and obligations on the economic operators in the supply chain, with such measures becoming more stringent as the cyber security risk of the product increases.

A possible consequence of non-compliance with these obligations will be fines of up to the higher of €15 million or 2.5% of the offender’s total worldwide annual turnover.

Now that the CRA has been published in the Official Journal of the European Union, it will come into force on 10 December, with its reporting obligations for manufacturers becoming applicable from September 2026 and the entire Act becoming fully applicable from December 2027.

ICO seeks permission to appeal Upper Tribunal's ruling in DSG Retail personal data case

Following an Upper Tribunal ruling overturning an earlier ICO decision against DSG Retail – which we covered in last month's bulletin, available here – the ICO has now confirmed that it has applied for permission to appeal against the Upper Tribunal's decision in the case.

This case has the potential to be significant as it bears upon the exact definition of personal data; specifically, whether the "hands" of the person who holds the data need to be considered when assessing whether data is personal data.

The facts of the case concern a data breach suffered by DSG Retail in which unknown malicious actors gained access to customer payment card details. The ICO initially found that the breached data constituted personal data, but the Upper Tribunal overturned this decision on appeal.

The Upper Tribunal's ruling was that, when considering whether the payment card data constitutes personal data, what is important is whether the malicious actors might reasonably be expected to have separate means available to them to identify data subjects in combination with the payment card details – not, as the ICO had found, whether DSG itself had separate means of identifying data subjects.

The customer payment card data was assessed by the Upper Tribunal as not constituting personal data "in the hands of" the malicious actors, despite it being personal data in the hands of DSG.

Enforcement and civil litigation

Federal Court of Justice of Germany rules mere loss of control over personal data can constitute non-material damage

On 18 November, the German Federal Court of Justice ruled that mere and short-term loss of control over personal data as a result of a violation of the GDPR can constitute non-material damage within the meaning of Article 82(1) of the GDPR. Data subjects suffering such damage could be entitled to compensation and would not be required to prove any noticeable negative consequences, or misuse of the data to their detriment. This decision is in line with the case law of the European Court of Justice but contrasts with the UK Supreme Court's decision in Lloyd v Google from 2021, which you can read more about in our previous article, here.

The German case relates to an April 2021 incident, where personal data of approximately 533 million Facebook users from 106 countries was publicly disseminated on the web. Unknown third parties had taken advantage of functionality that, depending on the searchability settings of a user, made it possible for a user's Facebook profile to be found with just their phone number using the "friend search" function on Facebook. Over the course of 2018 and 2019, unknown third parties entered random sequences of digits into the search function via the contact import function, and scraped the public data linked to the phone number (user ID, first and last name, place of work and gender) from the user accounts. Facebook made changes in 2019 to remove the functionality.

The Federal Court has referred the case back to the Court of Appeal for a new hearing and decision. The Court of Appeal will need to examine, among other things, whether the plaintiff had given consent to Facebook for its data processing, and whether Facebook's default searchability setting complied with the principle of data minimisation. The Federal Court also provided guidance on the assessment of non-material damage and explained why there were no legal objections to assessing the compensation for the mere loss of control at a level of €100. Other aspects of the appeal – the claim for a declaratory judgment that Facebook should be liable for future material and immaterial damages, the request for injunctive relief against processing of the plaintiff's phone number, and the request for information – were unsuccessful.

The full text of the decision is yet to be published, but the press release may be found here (in German). It indicates a possible divergence between the EU and the UK's approaches to the data protection damages that can be awarded.

Italy's data protection authority issues rebuke to bank for downplaying severity of data breach

The Italian data protection regulator, the Garante per la Protezione dei Dati Personali or "Garante", has rebuked banking group Intesa Sanpaolo in a public statement for failing to inform it of the full extent of a data breach, the number of individuals affected, and the potential risks to the rights of those individuals impacted by the breach.

The breach in question was allegedly caused by an employee of the bank, who is reported to have accessed the data of approximately 3,500 customers and clients of the bank. Media reports have alleged that current Italian Prime Minister, Giorgia Meloni, and her immediate predecessor in the office, Mario Draghi, are both among the individuals whose data was accessed, along with numerous other high-profile individuals.

The Garante has now stated that, in making its initial report of the breach, Intesa failed to candidly apprise the regulator of the scale and severity of the breach, the full extent of which only subsequently came to light through media reporting that Intesa then confirmed.

The Garante's statement, as reported by Reuters, notes that "contrary to the bank's assessment... the breach of the personal data represents a high risk for the rights and the freedoms of the individuals concerned." The Garante has ordered Intesa to provide it with particulars of the bank's security measures within 30 days, and to notify all affected customers within 20 days.

Preliminary data protection issues considered in Pacini & Anor v Dow Jones Inc.

On 29 October 2024, a ruling was handed down on two preliminary issues in a case we reported on in our July bulletin here. Investment bankers Pacini and Geyer brought a claim against Dow Jones over its publication of two Wall Street Journal articles, which the claimants allege contain inaccurate personal data and improperly processed criminal offence data, in breach of the UK GDPR.

Four months after a High Court judge dismissed Dow Jones's application to strike out the claim, Richard Spearman KC, sitting as a High Court judge, has now ruled on two preliminary issues. In his ruling, Spearman considered: (i) the single meaning of the personal data published by Dow Jones in its articles (in order to determine whether it was inaccurate); and (ii) whether it constituted criminal offence data.

This is reportedly the first time the Court has been asked to determine a preliminary issue of meaning in a data protection claim alone (as opposed to a claim for defamation). It is worth noting that the claimants missed the deadline to begin proceedings under a defamation claim (i.e. within one year of the defamatory statement being made).

In considering the first issue, Spearman determined the single meaning of the personal data in question, "by considering the [a]rticles as a whole, and interpreting each element of them by reference to the meaning that the hypothetical reasonable reader would take from it, read in its full context." Spearman opted to apply defamation principles for interpreting the natural and ordinary meaning of the words complained of. This approach allowed him to assess the accuracy of the personal data in the news articles. In his determination, Spearman also opted to apply the repetition rule, maintaining consistency with how a defamation claim would be treated. The repetition rule prohibits any kind of reasoning that a statement is less defamatory (or not defamatory at all) simply because it is a report of what someone else has said.

On the second preliminary issue, Spearman sided with Dow Jones on the question of whether the personal data contained in the articles constitutes criminal offence data under the UK GDPR, saying that the "hypothetical reasonable reader" would not conclude that the news article attributes the conduct of receiving secret profits to Pacini, or that it contains personal data relating to the "commission" or the "alleged commission" of any criminal offence by Mr Pacini. Spearman therefore determined that the data published did not constitute criminal offence data; his opinion being further reinforced by the fact that the article makes no mention of "crime" or "criminal proceedings" but instead clearly refers to civil proceedings alone.

We will keep you apprised of material developments as this case progresses.

Meta hit with multimillion-dollar fine by South Korean regulator for unauthorised data collection

Facebook's parent company Meta Platforms has been fined $15.67 million by South Korea's data protection agency, the Personal Information Protection Commission ("PIPC"), after the regulator found that the company had collected sensitive data of almost a million South Korean users without obtaining consent.

An investigation launched by the PIPC found that Meta analysed users' activities on Facebook to create targeted advertising categories relating to sensitive topics such as sexual orientation and religion. The investigation found that around 4,000 advertisers benefited from this unauthorised data usage, tailoring ads more effectively to particular users on the basis of the resulting insights.

Additionally, the PIPC's investigation found that Meta denied requests from users to access their personal information, in contravention of data privacy rights. The company also failed to prevent a hack that compromised the data of around ten South Korean users.

The fine levied by the PIPC is one of the most substantial to be imposed against a foreign tech firm in recent years. This is also not the first run-in that Meta has had with international data protection regulators, with fines of similar magnitude having been imposed on it in the US and the EU.

Round-up of enforcement actions

Company: Foodinho
Authority: Garante per la protezione dei dati personali (Italian DPA)
Fine/enforcement action: €5,000,000
Comment: The Italian DPA fined this food delivery company for numerous GDPR violations in the processing of its employees' data, including permanent geolocation tracking.

Company: Vodafone Romania S.A.
Authority: The National Supervisory Authority for Personal Data Processing (Romanian DPA)
Fine/enforcement action: €5,000
Comment: Complaints were made regarding Vodafone's disclosure of several email addresses. The regulator found that the company had not adopted sufficient technical and organisational measures to ensure the confidentiality of processed personal data, and fined it for violating Article 32 GDPR.
Our recent publications

If you enjoyed these articles, you may also be interested in our recent article on overcoming the challenges associated with using personal data in AI projects, which you can read here.

The team has also recently published a summary of recent developments in cybersecurity laws in both the UK and the EU, with a comparative analysis of the two regimes.