Data Protection update - August 2024

Welcome to the Stephenson Harwood Data Protection update, covering the key developments in data protection and cyber security law from August 2024.

This month, noyb filed nine GDPR complaints against X's plans to utilise user data to train its AI chatbot, the UK Government proposed increasing data protection fees, the EU and China began talks on cross-border data transfers, the ICO launched its fifth call for evidence on generative AI, the European Commission launched a call for evidence on the first review of the EU-US Data Privacy Framework, and the Greater London Authority published the London Privacy Register.

In cybersecurity news, IBM published a report revealing the average cost of a data breach in 2024, an NHS provider faces a provisional £6 million fine following a cyber-attack in 2022, the Australian government is set to introduce a new cybersecurity law, and reports show a sharp rise in cyberattacks on the shipping industry.

In enforcement and civil litigation news, Uber was fined €290 million by the Dutch data protection authority, the High Court allowed a transfer of personal data to Ukraine, Microsoft was found liable for consent-free cookie storage via third party websites and Meta's challenge to the European Data Protection Board's "pay or consent" opinion was published in the EU Official Journal.

We are also pleased to announce a new section of our newsletter. Our US Updates section covers upcoming US state privacy laws and the deadlines by which key provisions come into force.

Data protection

noyb files nine GDPR complaints against X's plans to utilise users' data to train AI chatbot

noyb, Max Schrems' data privacy group, has filed GDPR complaints with nine data protection authorities (each a "DPA") against X for its processing of user data to train its AI chatbot. noyb has filed complaints in: Austria, Belgium, France, Greece, Ireland, Italy, the Netherlands, Spain and Poland.

In May 2024, X began processing users' data to train its AI chatbot, Grok. However, it neither informed users that their data would be used in this way, nor sought their consent.

The UK's Information Commissioner's Office ("ICO") and the Irish Data Protection Commission ("DPC") have begun investigations into X's practices. On 8 August, X agreed to suspend its processing of users' data to train Grok while the DPC's investigation takes place. For further details on the ICO and DPC investigations into X, see this article from our monthly AI update.

noyb also filed complaints with other DPAs due to concerns that the DPC's investigation "was not questioning the legality of this processing itself" and was more focused on how X had begun processing users' data while it was still in a mandatory consultation process with the DPC. noyb expressed concerns over "countless instances [of] inefficient and partial enforcement by the DPC in the past years", prompting the group to take further action to ensure X complies with EU law.

The nine complaints made by noyb focus on X's lack of a valid legal basis under the GDPR for processing its users' data for the chatbot, as well as its failure to comply with other data protection obligations, such as the rules around transparency.

So far, no DPAs have publicly responded to noyb's complaints.

Government proposes increasing ICO data protection fees

The Department for Science, Innovation and Technology ("DSIT") has launched a consultation on increasing the ICO's data protection fees.

DSIT has stated that it believes the fee increase is necessary to provide the ICO with adequate resources for its current and future responsibilities (including the implementation of the proposed Digital Information and Smart Data Bill introduced in the King's Speech). It also wants to ensure that the fees are proportionate and that regulatory costs are spread fairly across data controllers.

Subject to the responses to the consultation, the Government aims to implement the fee changes in 2025.

The consultation closes on 26 September 2024 and can be accessed here.

EU and China begin talks on cross-border data transfers

The EU and China have entered into the first discussions under the Cross-Border Data Flow Communication Mechanism (the "Mechanism"). The Mechanism aims to address problems faced and raised by EU companies in China regarding cross-border flows of non-personal data in certain sectors (such as finance, insurance, pharmaceutical, automotive and information and communication technology), as well as to help these companies better understand Chinese data protection laws.

One of the key issues discussed was regulatory uncertainty. In 2022, China adopted the Measures for the Security Assessment of Data Exports (the "Measures"). Under the Measures, any data categorised as "important data" would be subject to a security assessment prior to any transfer. Many European companies operating in China felt that it was unclear what constituted "important data" and whether any data (including non-personal data) they transferred out of China would be caught by these restrictions.

Further talks are expected to take place in the future to review the progress made under the Mechanism.

ICO launches its fifth call for evidence on generative AI

The ICO has launched its fifth (and final) call for evidence on generative AI. This consultation is focused on the allocation of accountability for data protection compliance across the generative AI supply chain.

Please see our insight here for more information on the consultation.

The call for evidence is open until 18 September 2024 and can be accessed via this link.

Commission launches call for evidence on EU-US Data Privacy Framework

The European Commission (the "Commission") has launched a call for evidence on the EU-US Data Privacy Framework (the "Framework").

The Framework is subject to periodic review by the Commission, with the first review taking place within a year of its adoption to assess whether all parts of the Framework are in place and working as intended. For more information on the Framework, see our article here.

The Commission adopted its adequacy decision on the Framework on 10 July 2023, and it came into effect immediately. Under the Framework, US entities must self-certify in order to import personal data from the EU. Once they are certified, no further safeguards are required for the transfer.

The call for evidence is open until 6 September 2024. You can give feedback on how you think the EU-US Data Privacy Framework is functioning here.

Greater London Authority publishes the London Privacy Register

As part of the London Emerging Technology Charter (the "Charter"), the Greater London Authority has published a beta version of the London Privacy Register (the "Register").

The Charter provides practical and ethical guidelines for implementing "smart city projects" in London. These projects use data-enabled technology to enhance public services and the public realm across the city.

The Register is an archive of data protection impact assessments ("DPIAs") for "smart city projects that collect personal information in public spaces". For instance, on the Register, the public can view the DPIA for the Metropolitan Police's use of live facial recognition and TfL's use of Google Streetview in various Tube stations.

The aim is for these DPIAs to be published on the Register at the same time as the smart city project goes live. This allows the public to understand how their data is being used and the measures being taken to protect it. This is one of the goals of the Charter: to improve transparency for Londoners around products or services that are likely to result in a high risk to their privacy.

The Chief Digital Officer for London has stated that there are plans to expand the scope of the Register so that it also covers DPIAs carried out by local authorities and private landlords (as it currently only covers DPIAs carried out by public services). However, it is not clear when this expansion is expected to take place.

The Register is still in its beta phase, so the Greater London Authority may make further adjustments based on user feedback.

Cybersecurity

IBM report reveals average cost of a data breach in 2024

IBM has released its annual Cost of a Data Breach Report, offering deep insight into the nature of breaches, their causes, and the financial implications for businesses. The report, based on real-world incidents, highlights the factors that can either increase or decrease the costs associated with these breaches. It also offers expert recommendations on how organisations can effectively mitigate risks and enhance their cybersecurity strategies.

Key takeaways from the report include a significant increase in the global average cost of a data breach, which reached $4.88 million in 2024, the highest total ever. This marks a 10% increase over last year. Healthcare remains the most expensive industry for data breaches, with costs averaging $9.77 million per breach. Phishing and stolen or compromised credentials were the most prevalent attack vectors and are among the four costliest incident types.

Key recommendations

1. Know your information landscape
The study found that 40% of breaches involved data distributed across multiple environments, such as public clouds, private clouds and on premises. Having incomplete or outdated data inventories can delay breach detection and response, thus raising the cost of a breach. To mitigate this risk, organisations should maintain comprehensive visibility across all data environments, allowing continuous monitoring and protection of data wherever it resides.

2. Strengthen prevention strategies with AI and automation
The widespread use of generative AI models, third-party applications, Internet of Things devices, and SaaS applications is expanding the attack surface. Organisations should adopt AI and automation tools that enhance security prevention in areas like attack surface management, red-teaming, and posture management. Organisations that applied AI and automation saved an average of $2.22 million in breach costs compared to those that did not.

3. Take a security-first approach to gen AI adoption
As organisations rapidly adopt generative AI, only 24% of these initiatives are currently secured, posing significant risks to data and models. To mitigate these risks, organisations need to secure generative AI data, models, and usage, and establish AI governance controls. Key security measures include protecting training data from theft and manipulation, using data discovery and classification to identify sensitive data, and implementing controls such as encryption, access management, and compliance monitoring.

4. Level up your cyber response training
75% of the increase in average breach costs was due to the cost of lost business and post-breach response activities. Organisations should rehearse breach responses by participating in cyber range crisis simulation exercises to enhance their ability to handle high-impact attacks. Investing in post-breach response preparedness can help dramatically lower breach costs.

IBM's report underscores the critical importance of proactive cybersecurity measures and the growing complexity of the threat landscape.

NHS software provider faces £6 million fine after cyber attack

The ICO has provisionally decided to fine Advanced Computer Software Group Ltd ("Advanced") £6.09 million following an initial finding that the company failed to protect the personal data of 82,946 individuals, including sensitive data. Advanced provides IT and software services to the NHS and other healthcare providers, managing personal data on their behalf.

The provisional decision stems from a ransomware attack in August 2022, where hackers accessed Advanced's health and care systems through a customer account lacking multi-factor authentication. Personal data, including phone numbers, medical records, and home entry details for home care patients, was exfiltrated during the breach. The incident caused significant disruption to NHS 111 and other critical health services, although no evidence was found that the data was published on the dark web.

The UK Information Commissioner has emphasised the importance of information security, urging all organisations, particularly those handling sensitive health data, to implement robust security measures, including multi-factor authentication and regular system updates. The Commissioner’s findings remain provisional, and no final decision on the fine has been made. Advanced can respond before any penalty is confirmed.

Cyber ransom payments will need to be disclosed under new Australian law

The Australian government is set to introduce the Cyber Security Act (the "Act") which will require businesses to disclose ransomware payments, marking a significant shift in how cyberattacks are handled. The proposed Act, expected to be brought before parliament soon, aims to bring transparency to the growing practice of secret ransom payments, which are believed to fuel further cybercrime.

Under the Act, Australian businesses and government entities will be obliged to report any payments made to hackers or face fines. This move is intended to map the scale of the problem and expose the billions of dollars being paid to cybercriminals. The Act includes a "Limited Use Provision" that ensures that reported information will not be widely shared, except in narrow circumstances. This measure is designed to foster cooperation without exposing businesses to additional regulatory scrutiny. However, regulators would still be allowed to investigate and prosecute these companies.

However, the Act has sparked concerns that the new reporting requirements and potential $15,000 fines for failing to disclose a payment could threaten small businesses. Business groups are advocating for the mandatory reporting obligation to apply only to businesses with higher turnover thresholds, arguing that smaller organisations may not have the resources to comply.

Sharp rise in cyberattacks in shipping industry

The FT has reported that the shipping industry is grappling with a significant surge in cyberattacks, as geopolitical disputes prompt state-linked hackers to target trade flows. Reports reviewed by researchers at the Netherlands’ NHL Stenden University of Applied Sciences show that the number of maritime cybersecurity incidents has increased, with 64 recorded in 2023 compared to none in 2003. Moreover, over 80% of identified incidents since 2001 with a known attacker originated in Russia, China, North Korea or Iran.

This alarming trend is attributed to the sector's growing reliance on digital technologies. The significance of these cyberattacks lies in their ability to severely disrupt global trade and logistics. When a major port or shipping company is targeted, it affects supply chains worldwide, causing delays, rerouting of shipments, and financial losses. For instance, the 2020 attack on Iran's Rajaee Port, which handles nearly half of the country’s foreign trade, created significant bottlenecks, affecting the flow of goods in and out of the region.

The shipping industry faces significant challenges in combating the rise of cyberattacks, exacerbated by historically low IT spending in the sector. There is difficulty in finding experts who possess both maritime and cybersecurity knowledge—a highly specialised and limited talent pool. This shortage leaves shipowners vulnerable as they struggle to strengthen their defences against increasingly sophisticated threats.

Enforcement and civil litigation

Uber fined €290 million by the Dutch DPA for transfers of data to US in reliance on derogations

The Dutch DPA, the Autoriteit Persoonsgegevens (the "AP"), has fined Uber €290 million for failing to adequately protect its drivers' personal data when transferring data from Europe to the US. The failures related to the period after the EU-US Privacy Shield had been invalidated, when the ability to rely on standard contractual clauses was limited where the importer was itself subject to the GDPR.

Uber transferred the personal data of its European drivers (including data such as identity documents, location data, health data and criminal data) from Uber BV in the Netherlands to its parent (and joint controller) Uber Technologies in the US. This data was sent to the US parent company for central storage and effective management purposes. After complaints made by French Uber drivers to a human rights interest group were forwarded to the AP, the regulator began investigating.

Uber revised its approach to overseas data transfers in the wake of the invalidation of the EU-US Privacy Shield in the judgment known as “Schrems II” in 2020. In June 2021, the Commission published standard contractual clauses (“SCCs”), which had been updated to reflect the Schrems II ruling and the GDPR. The updated SCCs stated that they should not be used where the importer was directly subject to GDPR through its extra-territorial effect. It was expected that a lighter touch set of SCCs would instead be published for this purpose, although they have not yet appeared.

Uber began relying on the successor to the Privacy Shield, the EU-US Data Privacy Framework, from 27 November 2023. Before that, between August 2021 and November 2023, Uber did not rely on the SCCs to make transfers to its US joint controller. Instead, it relied on the derogations under Article 49(1)(b) and (c) of the GDPR to make these transfers. These derogations broadly apply where the transfer is necessary for a contract with the data subject, or a contract in the interest of the data subject.

The AP found that Uber did not have effective safeguards in place for these transfers and that it was therefore in breach of Article 44 of the GDPR.

It is difficult to know what Uber could have done in order to safeguard these transfers lawfully during a period of tremendous upheaval for US transfers. The instructions accompanying the SCCs, saying that they were not to be used where the importer is subject to GDPR, and the lack of an alternative to the Privacy Shield until mid-2023, left a legal lacuna. This meant that it would have been difficult to find any safeguard for US transfers that worked in these circumstances. One alternative may have been for Uber to ignore the instructions that the SCCs should not be used where the importer is subject to GDPR itself and simply rely on them anyway during this period, but this seems less than ideal.

Uber is planning to appeal the fine, as it argued that these transfers took place at a time of "immense [regulatory] uncertainty between the EU and the US".

While practitioners may have thought that the turbulence of data transfers was over now that the Data Privacy Framework is in place, this decision shows there may yet be more transfer-related fallout to come.

English court allows transfer of personal data to country without UK GDPR adequacy status

The High Court has granted a bank permission to transfer personal data to the Ukrainian Bureau of Economic Security (the "BES"). The bank sought to disclose two schedules attached to the particulars of claim in a Ukrainian criminal trial concerning the alleged embezzlement of the bank's funds. However, these schedules contained personal data of the defendants to the criminal proceedings. Given that the UK has not made an adequacy decision in respect of Ukraine, the bank needed to seek the court's permission to send the documents to the BES.

The High Court's decision focused on whether a relevant GDPR derogation allowed for the data transfer. Specifically, the court considered Article 49(1)(e), which permits the transfer of personal data to a jurisdiction without an adequacy decision if "the transfer is necessary for the establishment, exercise, or defence of legal claims."

The Court ruled in favour of the bank, finding that the derogation applied in this case. In reaching its decision, the Court also took into account the fact that the personal data was already in the public domain.

The judgment heavily relied on the ICO's guidance on the application of the derogations, in particular that the derogation "must be a targeted and proportionate way of achieving a specific purpose".

You can read the judgment in full here.

Microsoft found liable for consent-free cookie storage via third-party websites

The Higher Regional Court of Frankfurt am Main has ruled that a Microsoft subsidiary ("Microsoft") is liable for storing cookies on users' devices without their consent, even if that consent was supposed to be obtained by third-party website operators. This decision follows an appeal by a plaintiff who argued that cookies from the "Microsoft Advertising" service were placed on her device without her permission while visiting third-party websites.

Through the "Microsoft Advertising" service, Microsoft provides website operators with tools to display ads and track user activities via cookies. These cookies help serve targeted ads based on user behaviour. Initially, the Regional Court of Frankfurt am Main dismissed the plaintiff’s request for an interim injunction, arguing that the plaintiff could block cookies through her browser settings, while an injunction would impose significant burdens on Microsoft, requiring extensive changes to its processes. Thus, the balance of interests is therefore in favour of Microsoft.

On appeal, the Higher Regional Court overturned this decision and granted the injunction, prohibiting Microsoft from using cookies on the plaintiff’s devices without her consent. This decision is final and not subject to appeal. The court found that Microsoft is directly involved in the storage of these cookies and cannot shift responsibility solely to the website operators, even if their terms and conditions state that the operators must obtain user consent. The court emphasised that the law prohibits any access to a user’s networked devices without their explicit consent, holding Microsoft accountable as it facilitated the storage and reading of cookies through the code it provided to websites.

Meta's challenge to the EDPB's "pay or consent" opinion published in the EU Official Journal

Meta's challenge to the European Data Protection Board's ("EDPB") "pay or consent" opinion, published in April 2024, has now been published in the EU Official Journal.

For further details on Meta's challenge, please see our insight here. Additionally, for further details on the EDPB's "pay or consent" opinion, please see our insight here.

Round-up of enforcement actions

Company | Authority | Fine | Comment
Credit Agricole Auto Bank | Italian DPA | €1 million | Fined for unlawful processing of personal and income data of customers who requested financing for long-term car rentals.
GSMA Limited | Spanish DPA | €600,000 | Fined for forcing employees to provide sensitive health information (e.g. Covid vaccination certificates).
American Heart of Poland | Polish DPA | €330,000 | Fined for failing to take adequate security measures, leading to a data breach which impacted 21,000 people.
Various companies | Turkish DPA | £14 million (aggregate fine) | The Turkish DPA carried out a sweep of data controllers with an obligation to register with it and issued this aggregate fine to those that had failed to do so. This included both Turkish data controllers and those located outside Turkey.
AliExpress | South Korean DPA | 1.9 billion Korean won (around $1.43 million) | Fined for failing to notify customers that their data was being transferred outside South Korea. The DPA also found that it was difficult for customers to exercise their data subject rights (e.g. account deletion).

US updates

July was a busy month for US state privacy laws, with new laws coming into effect in Texas and Oregon, and new requirements under existing laws coming into force in Colorado, New Hampshire, Tennessee and Oregon.

State | Date | Law
Texas | 1 July | Texas Data Privacy and Security Act came into effect.
Oregon | 1 July | Oregon Consumer Privacy Act came into force.
Colorado | 1 July | Under the Colorado Privacy Act, organisations that are data controllers under the Act must offer consumers a universal opt-out mechanism for the sale of their personal data or the use of their data for targeted marketing.
New Hampshire, Tennessee, Oregon | 1 July | DPIA requirements apply to processing activities created or generated after this date.