Data Protection update - September 2024

Welcome to the Stephenson Harwood Data Protection update, covering the key developments in data protection and cyber security law from September 2024.

This month, the European Commission announced plans to consult on a new set of SCCs for third-country data importers directly subject to the GDPR, Meta is going ahead with planned "data scraping" activities in the UK after ICO guidance, the ICO has reprimanded Sky Betting and Gaming for cookie non-compliance as part of its wider cookies "crackdown", and the European Commission published a series of clarifying FAQs on the EU Data Act.

In cybersecurity news, TfL has been handling a prolonged cybersecurity incident throughout September, which continues to cause disruption to some of its online services. The latest updates indicate that some customers' personal information and banking details may have been maliciously accessed.

In enforcement and civil litigation news, the Dutch DPA has fined Clearview AI for continued GDPR breaches, with the additional threat of further penalties if these breaches are not stopped and the potential that Clearview's directors may be personally sanctioned; the Seoul High Court has upheld a 2020 ruling against Meta by the South Korean data protection authority, with Meta now appealing; and the Ofcom Chief Executive has set out how Ofcom proposes to enforce the requirements of the UK's Online Safety Act on "small but risky" online platforms.

Data protection

European Commission trails planned consultation on new SCCs for third-country controllers and processors subject to GDPR

The European Commission has announced an upcoming consultation on developing a new set of standard contractual clauses ("SCCs") for international data transfers to data importers situated in third countries that do not have a Commission adequacy decision in place. This new set of SCCs will be drafted specifically to cover transfers to third-country importers who are directly subject to the GDPR ("directly subject importers") – as opposed to the existing set of SCCs, which technically should be applied only to transfers to third-country importers not subject to the GDPR.

Data importers may be directly subject to the GDPR, notwithstanding their third-country status, by virtue of the extraterritorial application of Article 3(2) of the GDPR. This says that where processing of personal data of individuals in the EU is carried out by a controller or processor not situated in the EU, the GDPR will nevertheless apply if either: (a) the processing relates to offering goods or services to people in the EU; or (b) the processing relates to monitoring of EU individuals' behaviour (so far as that behaviour takes place in the EU).

The Commission has previously indicated (in responses to frequently asked questions, published in May 2022) that exporters (whether within or outside the EEA) can use the existing 2021 set of SCCs in any dealings they may have with non-EEA importers that are not directly subject to the GDPR.

However, the Commission went on to confirm the statement in the recitals to the 2021 SCCs that they should not be used by exporters when contracting with directly subject importers, because the obligations they impose would "duplicate and, in part, deviate from the obligations that follow directly from the GDPR".

It has been far from clear, in the intervening time, what measures entities seeking to export data to directly subject importers should put in place, if not the 2021 SCCs. In a recent decision, which highlighted this issue, Uber was fined €290 million by the Dutch Data Protection Authority, in a case which appeared to put Uber between a very large rock and a very hard place.

We covered this decision in our August 2024 Data Protection Bulletin, which you can read here. In short, Uber was fined because it did not use any SCCs in the intragroup transfer agreements between its European subsidiary and its US parent company; yet the US parent was, by virtue of Art. 3(2) of the GDPR, a directly subject importer, meaning that, per the Commission's own guidance, those agreements should not have included the 2021 SCCs.

Although potentially coming too late for Uber, these new SCCs will address this invidious position, giving EEA entities clear requirements as to the standard clauses that must be included in data transfer agreements they enter into with directly subject importers. The consultation is anticipated to open in the fourth quarter of 2024, with the new SCCs expected to be adopted by the Commission and ready for use by the second quarter of 2025.

Meta plans to resume data-scraping exercise in the UK for AI model training, following ICO clarification

Clarification provided by the ICO

As a result of receiving "clarity and certainty" from the UK Information Commissioner's Office ("ICO"), Meta is resuming plans in the UK to scrape and process Facebook and Instagram user data to train artificial intelligence models.

The original plan was for these Meta-owned and operated platforms to use public posts by UK users, including photos and videos, to train Meta's multi-modal generative AI models. Meta has now expanded these plans so that comments on users' posts will also be used in the training exercise. Meta will also continue to rely on "legitimate interests" as its legal basis for the processing of personal data entailed by this data scraping.

Clarification from the ICO has also led Meta to make changes to how transparently it communicates its plans to users, and to simplify the opt-out process: users no longer have to type out their reasons for opting out of the project.

The data scraping in practice

Meta has stated that it will only collect data from users over 18 (based on the date of birth given when users signed up to a platform, rather than on age verification). Users must also be UK residents, which will be determined based on factors such as IP addresses and interaction history.

The collection of data will be retrospective: users who have not opted out will have all the public data they have shared since turning 18 used to train the AI models. Data will not be collected from private accounts, or from messages sent between users.

The collection is due to begin in October this year.
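Purely by way of illustration, the criteria described above amount to a filter applied to each post before it can be used for training. The following Python sketch is our own reconstruction; the field names and logic are assumptions, not Meta's actual implementation.

```python
from datetime import date

# Hypothetical sketch of the reported selection criteria; Meta's actual
# implementation is not public.

def years_between(earlier: date, later: date) -> int:
    """Whole calendar years elapsed from 'earlier' to 'later'."""
    years = later.year - earlier.year
    if (later.month, later.day) < (earlier.month, earlier.day):
        years -= 1
    return years

def eligible_for_training(user: dict, post: dict) -> bool:
    """Apply the reported criteria: public posts made by adult UK users."""
    return (
        user["uk_resident"]                 # inferred from IP address etc.
        and not user["opted_out"]           # opt-out no longer needs reasons
        and not user["account_private"]     # private accounts are excluded
        and post["visibility"] == "public"  # messages between users excluded
        and years_between(user["date_of_birth"], post["posted_on"]) >= 18
    )

# Toy example: a public post made when a UK-resident user was 20 years old.
user = {"uk_resident": True, "opted_out": False, "account_private": False,
        "date_of_birth": date(2000, 5, 1)}
post = {"visibility": "public", "posted_on": date(2020, 6, 15)}
print(eligible_for_training(user, post))  # -> True
```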

Contrast with the EU

While Meta is resuming its data scraping plan in the UK, the same project is paused in the EU and European Economic Area and is currently under investigation by the Irish Data Protection Commission ("Irish DPC"). The Irish DPC has also asked the European Data Protection Board to provide an opinion on the processing of personal data to train AI models.

Commenting on the different approaches taken by the EU and the UK, the head of EU affairs at Meta, Marco Pancini, said "this clear stance contrasts sharply with EU regulators, who continue to struggle with reaching a consensus on the application of the law…The EU is at risk of lagging behind due to its inconsistent and complex regulations".

No requirement for controllers to disclose algorithms in DSAR responses, according to ECJ Advocate-General

In an opinion delivered on 12 September, Advocate-General Richard de la Tour of the Court of Justice of the European Union ("CJEU") concluded that, while data subjects have the right to access information on the method and criteria used for automated decision-making, controllers are not legally obliged to disclose technical information to data subjects who are not in a position to understand it.

This opinion stems from a 2022 case brought before the administrative court of Vienna, where a data subject sought information underlying the automated assessment of her data by the credit rating agency Dun & Bradstreet.

De la Tour clarified that, while the GDPR ensures transparency, it does not require data controllers to disclose the full algorithm. Instead, they are obliged to provide "meaningful information about the logic involved" in algorithmic decision-making under Article 15 of the GDPR.

The controller is required to present the information in a clear, straightforward manner that is concise, easy to access, and simple to understand, using clear and plain language. De la Tour also opined that, if necessary, the data should be contextualised so that individuals can ensure there is "an objectively verifiable consistency and causal link" between the method used by the controllers and the results of the automated decisions.

However, de la Tour stressed that if the information is likely to breach the rights of others (for example, if it contains personal data of third parties or trade secrets), then it must instead be disclosed to the supervisory authorities or the courts, which will weigh the competing interests and determine whether to grant the access request.

This opinion highlights the balance between protecting individuals' rights to understand how decisions are made about them and safeguarding the privacy rights of third parties and businesses' intellectual property. While Advocate-General opinions are non-binding, they are nonetheless influential, and the CJEU often follows them in its rulings.

Commission publishes clarifying FAQs on the EU Data Act

A year before the EU Data Act becomes applicable in September 2025, the European Commission has published a series of FAQs and clarifying responses on various aspects of the Act, seeking to create greater certainty for organisations currently preparing for the new law to take effect.

We are currently preparing a piece looking into the key clarifications provided by these FAQs. Once this is published, it will be available on our Data Protection Hub via our main page on the Data Act, which you can find here. The team has also produced a broader series of articles on the key features of the Data Act, which can be found here (in respect of data access rights), here (on third-party data sharing obligations) and here (covering switching, interoperability and prevention of unauthorised access to non-personal data).

Cyber security

TfL faces "cyber security incident" throughout September, impacts still being felt

On 1 September, Transport for London ("TfL") reported that it was dealing with an "ongoing cyber security incident". The situation is still developing, and TfL, alongside the National Cyber Security Centre and the National Crime Agency, is currently investigating the incident. It has also been reported that a 17-year-old from Leicester was arrested two weeks ago in connection with the incident.

While the incident so far appears to have had a limited impact on TfL customers, the body has confirmed that certain customer data has been accessed. This includes some customer names, contact details, email addresses and home addresses. TfL has also reported that some Oyster card refund data may have been accessed, possibly including the bank account numbers and sort codes of around 5,000 customers.

To protect services and secure its systems, TfL said that it is taking "proactive measures", such as temporarily suspending applications for new Oyster photocards and undertaking an all-staff IT identity check. Staff have also been asked to work from home where possible.

TfL is maintaining a dedicated page on its website with updates on the incident and the measures being taken in response, which you can view here.

Meta fined €91 million by Irish DPC over unencrypted password storage

The Irish Data Protection Commission ("DPC") has fined Facebook parent company Meta €91 million over security failings dating back to 2019.

During a routine security review in January of that year, Facebook discovered that it had been storing the passwords of hundreds of millions of users in unencrypted plain-text files, which could have been read by anyone who gained access to those files. The affected individuals were predominantly users of its pared-back "Facebook Lite" offering, but also included users of the main Facebook platform, as well as Instagram users.

The social media giant self-reported the issue to the DPC, which levied the fine earlier this month, having submitted a draft decision to other European data protection authorities earlier this year and received no objections in response.

Facebook has stated that there is "no evidence" that the unprotected password data had actually been compromised or maliciously used, but the DPC noted the potential seriousness of the harms that could have arisen if they had been. It pointed out that these passwords were "particularly sensitive, as they would enable access to users' social media accounts", and that this had been a breach of "widely accepted" practice that passwords should not be stored in plain text format.
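For context on the "widely accepted" practice the DPC refers to: rather than keeping passwords in readable form, systems conventionally store only a salted one-way hash, so that a leaked credential file cannot be turned back into usable passwords. Below is a minimal sketch using Python's standard library; the function names and parameters are illustrative, not a description of Meta's systems.

```python
import hashlib
import hmac
import os

ITERATIONS = 600_000  # PBKDF2-SHA256 work factor; higher slows brute force

def hash_password(password: str) -> tuple[bytes, bytes]:
    """Return (salt, digest); only these are stored, never the password."""
    salt = os.urandom(16)  # unique per-user salt defeats precomputed tables
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return salt, digest

def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
    """Re-derive the hash from the attempted password and compare."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return hmac.compare_digest(candidate, digest)  # constant-time comparison

salt, digest = hash_password("correct horse battery staple")
assert verify_password("correct horse battery staple", salt, digest)
assert not verify_password("wrong guess", salt, digest)
```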

This is not the first time the DPC has fined Meta for failings in data protection and security – for example, in May of last year it levied a fine of €1.2 billion, the largest to date issued under the GDPR, on Meta in connection with the mishandling of personal data transfers from the EU to the US. However, this fine is notable for the lack of objections from other European authorities through the "one stop shop" mechanism.

Enforcement and civil litigation

ICO issues reprimand to Sky Betting and Gaming for placing cookies without user consent

The UK Information Commissioner's Office has reprimanded Sky Betting and Gaming for placing non-essential marketing cookies on users' devices without users' consent, before the "cookie banner" was displayed to users on visiting the website. This forms part of broader work the ICO is conducting on cookie compliance amongst UK websites, signalling something of a "crackdown".

For a fuller summary of the nature of the non-compliance for which Sky Betting and Gaming was sanctioned and of the ICO's other work on cookies, see our insight here.

Dutch DPA fines Clearview AI and contemplates further sanctions on individual Clearview directors and companies using Clearview services

The Dutch DPA (the Autoriteit Persoonsgegevens, or "AP") has imposed a fine of €30.5 million on Clearview AI for illegally gathering and using personal data, including billions of photos of faces, for use in its facial recognition technology.

The AP's decision states that Clearview has engaged in practices involving automated web-scraping of images of faces, conversion of these images into a "unique biometric code" for each individual face, and the provision of a service allowing Clearview customers to submit an image of a face and have it compared and matched against other occurrences of that individual's face in Clearview's database.

This processing was conducted without the knowledge of any of the individuals whose faces appeared in the database – nor did Clearview seek or obtain consent from any of them. The AP held that this constituted a "serious" violation of the GDPR and that Clearview's services are "illegal".
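For illustration of the matching step described in the AP's decision: facial-recognition systems of this kind typically reduce each face image to a numeric embedding (the "unique biometric code") and declare a match when two embeddings are sufficiently similar. The Python sketch below is a generic reconstruction; the embedding size, threshold and function names are our assumptions, and Clearview's actual system is proprietary.

```python
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Similarity of two face embeddings in [-1, 1]; higher is more alike."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def find_matches(probe: list[float],
                 database: dict[str, list[float]],
                 threshold: float = 0.9) -> list[str]:
    """Return IDs of database faces whose embedding resembles the probe."""
    return [face_id for face_id, embedding in database.items()
            if cosine_similarity(probe, embedding) >= threshold]

# Toy embeddings standing in for scraped images converted to biometric codes.
database = {"person_a": [0.9, 0.1, 0.2], "person_b": [0.1, 0.8, 0.5]}
print(find_matches([0.88, 0.12, 0.21], database))  # -> ['person_a']
```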

Notably, the AP's decision points out that Clearview did not cease the activities that constituted GDPR violations even after the AP's investigation had taken place. In measures that reflect the seriousness of the breaches the AP considers Clearview to have committed, it has therefore ordered Clearview to cease the relevant activities, on pain of a further penalty of up to €5.1 million on top of the fine already imposed.

The AP has also warned that it is investigating the possibility of holding Clearview managers personally liable for what it has said are serious and repeated GDPR violations, ongoing in spite of previous fines imposed by other European data protection authorities. Going even further, the AP has stated that any Clearview service user is also "breaking the law", and that any Dutch organisation that uses Clearview can expect "hefty fines" as a consequence.

Ofcom sets out targeted approach for regulating and supervising "small but risky" UK online platforms under new legislation

Dame Melanie Dawes, the Chief Executive of the UK's media regulator Ofcom, has outlined – in a letter to the Secretary of State for Science, Innovation and Technology – the regulator's plans to take action against "small but risky" online services in the UK.

Ofcom was recently handed an expanded set of powers under the Online Safety Act 2023 (the "Act") to implement and enforce the Act's provisions and to take on the role of independent regulator for online safety. It is pursuing a phased approach to bringing in guidance and codes of practice for different types of online service, with the duties under these codes anticipated to come into effect in early 2025.

"Small but risky" is a term that has been used to describe online platforms which, whilst not reaching very many users or having a widespread online presence, nevertheless pose, or may pose, significant risk to UK citizens – for example, because they host particularly risky or harmful types of content.

Using its new powers of enforcement and supervision, Ofcom's planned approach – as described in the Chief Executive's letter – will be one of "targeted oversight". It will pursue direct engagement with, and supervision of, selected online services that pose particular risks, owing either to their size or – as in the case of these so-called "small but risky" platforms – to the nature of the service provided (for example, file-sharing services, which provide a potential route for individuals seeking to share illegal and harmful material).

The draft codes of practice published by Ofcom contemplate that "services that pose the most risk should take the greatest precautions". In other words, services that pose a high risk will need to adopt increased protections and safeguards irrespective of their size.

Ofcom will pursue a "bespoke" approach, in some cases seeking engagement to drive improvements, while in other cases – in particular, if the platform in question "knowingly or explicitly welcome[s] users whose behaviour is likely to be illegal or harmful", or if it fails to engage with regulatory supervision and there is evidence of non-compliance – Ofcom considers that "enforcement may be the most effective route".

South Korean court upholds sanctions on Meta over Cambridge Analytica-related personal data sharing

Meta Platforms, parent company of Facebook, has suffered a defeat in its long-running battle against a decision originally handed down in 2020 by the South Korean Personal Information Protection Commission ("PIPC"). Following the Seoul High Court's ruling against it, Meta has appealed to the South Korean Supreme Court.

The original decision, comprising a fine of 6.7bn South Korean won (circa $5m) and a series of corrective orders, was imposed because the PIPC held that Facebook had shared the personal information of the friends of Facebook users – over 3.3m South Korean individuals, themselves also Facebook users – with third-party entities, without having obtained proper consent from those individuals.

This sharing occurred when a user selected the "Facebook Login" option provided by many third-party entities as a means of logging into their online services or applications. The effect of selecting this feature was that data held by Facebook on that individual, including their friends' information, would be shared with the third party.

Among those third-party services with which users' personal data was shared was This Is Your Digital Life. This third-party service was one of the means by which Cambridge Analytica obtained personal data, which it used in targeting political advertising and political campaigning generally.

Facebook argued before the Seoul High Court (which heard the appeal) that it was therefore the user, not Facebook, who had in effect made the decision to share the data with the third party, and that Facebook could not have proceeded with such sharing in the absence of the user's positive action in selecting the Facebook Login option.

However, the court ruled against Meta, with the full details of the reasoning behind the decision expected shortly.

Round-up of enforcement actions

Company: Apoteket AB and Apohem AB
Authority: Swedish DPA
Fine: €3.2 million and €698,000 respectively
Comment: The Swedish DPA found that both companies had breached Art. 32(1) of the GDPR by failing to have in place appropriate technical and organisational measures to ensure sufficient security for their customers' personal data when placing the Meta pixel on their sites and subsequently transferring sensitive personal data to Meta.

Company: Uniqlo Europe, Ltd, Sucursal En España
Authority: Spanish DPA
Fine: €270,000
Comment: In response to a former employee's request for access to their payroll information, the named company (as data controller) sent the former employee an email with an attachment containing their payroll information alongside that of 446 other workers.

Company: Unspecified company
Authority: Thai DPA
Fine: 7m Thai Baht (approximately €200,000)
Comment: In the first administrative fine issued under the Thai Personal Data Protection Act, the Thai DPA issued the maximum possible fine to an unnamed online goods-trading company which had failed to appoint a data protection officer; failed to implement appropriate security measures, leading to a leak of personal data; and failed to take prompt corrective action and notify the authorities as soon as it became aware of the breach.

Company: Hera Comm S.p.A.
Authority: Italian DPA
Fine: €5 million
Comment: The Italian DPA highlighted "serious violations" in the company's processing of customers' personal data. A number of door-to-door salesmen used customer personal data, to which they had access owing to the company's inadequate technical and organisational measures, to fraudulently commence contracts for gas and electricity supplies without the actual knowledge of those customers. The company's monitoring system was also inadequate, to the extent that it did not identify these fraudulent transactions during the ostensible customer "onboarding" process.

Company: Various companies
Authority: Croatian DPA
Fine: €270,000 (aggregate fine)
Comment: The Croatian DPA imposed 12 fines, with an aggregate value of €270,000, on various companies for GDPR non-compliance, including a €190,000 fine on a hospital for losing sensitive patient data, failing to implement technical safeguards and failing to report the incident to the DPA within the required time period; and a combined €45,000 on two hotels for unauthorised use of cookies.

Key US updates

State: California
Date: 28 September
Law: SB-1223 "Consumer privacy: sensitive personal information: neural data", having passed the California legislature, was approved and signed by the Governor.

State: California
Date: 28 September
Law: AB-1008 "California Consumer Privacy Act of 2018: personal information", having passed the California legislature, was approved and signed by the Governor.

State: Montana
Date: 1 October
Law: The Montana Consumer Data Privacy Act went into effect.