Data Protection update – November 2023
Welcome to the Stephenson Harwood Data Protection bulletin, covering the key developments in data protection law from November 2023.
This month, the European Parliament approved the EU Data Act, the Data Protection and Digital Information (No. 2) Bill progressed through the UK Parliament, and the ICO and EDPB issued guidance and warnings on the use of cookies.
In AI news, the Biden Administration released an Executive Order on AI, promoting AI regulation in the US and internationally; the EU's AI Act reached the final stages of the legislative process and G7 states published an AI code of conduct aiming to promote safe, secure and trustworthy AI development worldwide.
Elsewhere this month, the French data protection regulator issued fines totalling €97,000 and the Industrial and Commercial Bank of China suffered a significant ransomware attack which affected market trading.
Data protection
- The King's Speech and progress on UK data protection reforms
- European Parliament approves EU Data Act
- EDPB issues draft guidelines on technical scope of Article 5(3) of ePrivacy Directive
- ICO publishes draft UK BCR Addendum
AI
- Biden announces Executive Order on AI
- G7 publishes code of conduct on artificial intelligence
- EU AI Act negotiations come to a head over foundation model regulations
Cyber security
- NCSC publishes guidelines for secure AI system development
- DSIT publishes updated code of practice for app store operators and app developers
- ICBC suffers from ransomware attack that disrupted the US Treasury market
- Ransomware group reported their own victim for failing to disclose cyber attack
Enforcement and civil litigation
- Meta faces EU ban on relying on legitimate interests and contract to process personal data for behavioural advertising
- CNIL imposes multiple fines under simplified procedure
- EasyJet ICO investigation dropped
- Ireland finds Airbnb in violation of legitimate interests processing, data minimisation and storage limitation
- Round-up of enforcement actions
Data protection
The King's Speech and progress on UK data protection reforms
On Tuesday 7 November 2023, the Data Protection and Digital Information (No. 2) Bill ("DPDI"), the Digital Markets, Competition and Consumers Bill ("DMCCB") and the Investigatory Powers Amendment Bill ("IPAB") were among the twenty bills announced in the King's Speech, opening the new session of the UK Parliament.
In July 2022 we wrote here about the introduction of the first version of the DPDI Bill to Parliament, which aimed to update and simplify the UK's data protection framework. The second version of the DPDI Bill made some additional changes, which we wrote about in our insight here.
On 23 November 2023, the government announced amendments to the DPDI Bill, billed as a "raft of common-sense changes". Many of the amendments (the details of which can be found here) were minor and technical in nature, but some were more consequential. Key amendments include:
- clarifying that controllers only need to conduct reasonable and proportionate searches when responding to data subject access requests;
- a proposed "data preservation process" requiring social media companies to retain any relevant personal data relating to a child who has died by suicide, for use in subsequent investigations or inquests;
- facilitating the use of biometric data for the purpose of strengthening national security;
- introducing powers for the Department for Work and Pensions to carry out checks on bank accounts held by benefit claimants in order to tackle benefit fraud; and
- enhancing the ICO's governance arrangements to ensure its independence, increase its transparency and improve its accountability to Parliament. This includes a requirement for the ICO to consider, but not be bound by, a government statement of strategic priorities approved by Parliament. There are also limited powers for the Secretary of State ("SoS") to approve statutory codes of practice, and a requirement for the SoS to be transparent when deciding to refuse approval; final approval of statutory codes of practice remains with Parliament.
The government has stressed that the amendments should not affect the EU's adequacy decisions in respect of the UK. From the ICO's perspective, the measures to enhance its governance arrangements should assist in maintaining the UK's adequacy status.
On 29 November 2023, the above amendments were approved for inclusion in the DPDI Bill, which moved in its amended form to the next stage of its progress through the legislative process (the report stage in the House of Commons).
Some of the other Bills in the King's Speech are also relevant to UK data protection law:
- The DMCCB aims to enhance outcomes for consumers and businesses by fostering innovation and tackling the underlying factors contributing to competition challenges in digital markets. For example, the courts will have the authority to impose financial penalties on traders who violate consumer laws. However, the European Commission may take the view that a greater degree of enforcement power will be held by public bodies rather than by the Commission itself.
- The UK government has said that the IPAB will implement changes to "improve the intelligence services' ability to respond with greater agility and speed to existing and emerging threats to national security". However, the proposed changes could have an impact on the UK's adequacy decisions from the EU: a balance will need to be struck between enabling public authorities to protect the public and ensuring that personal data is not accessed or used disproportionately. For example, the trade group TechUK warned that, under the changes in the IPAB, companies may have to comply with warrants issued by the Home Office and hand over user data even while a review of whether the request is appropriate is ongoing.
Finally, the UK government's Department for Science, Innovation and Technology ("DSIT") has published The Data Protection (Fundamental Rights and Freedoms) (Amendment) Regulations 2023 (the "SI"), which will amend references to "fundamental rights and freedoms" in the UK GDPR and the Data Protection Act 2018 so that they refer to rights recognised under UK law rather than retained EU law rights. Under the Retained EU Law (Revocation and Reform) Act 2023, those retained EU law rights will no longer be recognised in UK law after the end of December 2023; the SI therefore ensures that the definition of "fundamental rights and freedoms" remains accurate going forward.
European Parliament approves EU Data Act
The European Parliament adopted the EU Data Act ("Data Act") on Thursday 9 November 2023 after receiving majority support from MEPs. The aim of the adopted text was set out in the press release announcing the proposal of the Data Act in February 2022, in which the European Commission explained that the Data Act "will ensure fairness in the digital environment, stimulate a competitive data market, open opportunities for data-driven innovation and make data more accessible for all". See our insight for more information on the key provisions of the Data Act and here for more information on amendments proposed as the Data Act was negotiated.
Interestingly, the controversy over how one particular provision of the Data Act impacts commercial smart contracts looks set to continue, as the provision has remained in the adopted text. Article 30 of the Data Act sets out essential requirements for smart contracts used to execute data sharing agreements, in particular that smart contracts must "include internal functions which can reset or instruct the contract to stop or interrupt the operation to avoid future (accidental) executions". Blockchain and cryptocurrency firms, including Stellar and Polygon, voiced their concerns in an open letter in June 2023. The key concern is that the Data Act cuts across the typically unchangeable nature of the smart contracts commonly used on the blockchains that underpin decentralised finance.
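By way of illustration only, the sketch below models the kind of "stop, interrupt or reset" control that Article 30 appears to contemplate. It is written in plain Python rather than an on-chain language, and every name in it is hypothetical; it is a conceptual sketch, not an implementation mandated by the Data Act.

```python
# Conceptual sketch only: a plain-Python model (not Solidity or any on-chain
# language) of the "safe termination or interruption" control described in
# Article 30 of the Data Act. All names here are hypothetical.

class DataSharingSmartContract:
    """Models a smart contract that executes a data sharing agreement."""

    def __init__(self, authorised_party: str):
        self.authorised_party = authorised_party
        self.halted = False       # once True, no further executions take place
        self.shared_records = []  # record of data-sharing operations performed

    def share_data(self, record: dict) -> bool:
        """Perform one data-sharing execution, unless the contract is halted."""
        if self.halted:
            return False          # interrupted: avoids future (accidental) executions
        self.shared_records.append(record)
        return True

    def interrupt(self, caller: str) -> None:
        """Internal function to stop or interrupt operation (a 'kill switch')."""
        if caller != self.authorised_party:
            raise PermissionError("only the authorised party may interrupt")
        self.halted = True

    def reset(self, caller: str) -> None:
        """Internal function to return the contract to a known, halted state."""
        if caller != self.authorised_party:
            raise PermissionError("only the authorised party may reset")
        self.halted = True
        self.shared_records.clear()
```

The difficulty highlighted in the open letter is that contracts already deployed on public blockchains are typically immutable, so controls of this kind generally cannot be retrofitted to existing decentralised finance smart contracts.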
The Data Act received formal approval from the European Council on 27 November 2023. It is likely that the act will apply in the EU from Autumn 2025.
EDPB issues draft guidelines on technical scope of Article 5(3) of ePrivacy Directive
On 16 November 2023, the European Data Protection Board ("EDPB") issued draft Guidelines 2/2023 on Technical Scope of Art. 5(3) of ePrivacy Directive ("Guidelines"). Article 5(3) of the ePrivacy Directive requires consent before storing or accessing information on an end user's device through cookies or similar technologies. The Guidelines are designed to shed light on how these rules apply to emerging tracking techniques, such as URL and pixel tracking.
The ICO has previously issued guidance that it should be as easy for users to "Reject All" advertising cookies as it is to "Accept All", and that when users "Reject All" tracking, non-essential trackers must not be used. Further, on 21 November 2023, the ICO issued a statement warning some of the UK's top websites that they face enforcement action if they do not change their cookie practices to comply with data protection law.
The Guidelines identify and explain four key elements that determine the applicability of the cookie rules, in particular their definitions and scope. Key points to note include:
- 'Information' – the cookie obligations apply to both personal and non-personal data stored or accessed on an end user's device.
- 'Terminal equipment of a subscriber or user' – devices that are solely used for the relaying of communications without modifying the information will not be considered terminal equipment under the cookie rules. Terminal equipment refers to the device on which a cookie is placed: this could be a computer or phone but would not include devices that do not have the ability to store information.
- 'Electronic communications network' – means, for the purposes of the cookie rules, any network system that allows transmission of electronic signals between its nodes, regardless of the equipment and protocols used. This could be a wired or wireless network, for example, a mobile phone network.
- 'Gaining access or storage' – if an entity actively takes steps to gain access to information stored in the terminal equipment, the cookie rules will apply. The notion of 'gaining access', for example through tracking cookies, is independent from the notion of 'storing information', for example through browser storage cookies. However, both gaining access and storing information are governed by the cookie requirements.
The Guidelines also clarify that the obligations in relation to cookies and similar technologies also apply to use cases such as the use of URL and pixel tracking, local processing, tracking based on IP only, intermittent and mediated IoT reporting and unique identifiers. In combination with the updates in the UK cookies landscape explored above, this could spell significant change for how the adtech industry drives analytics and engagement.
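By way of illustration only, the minimal sketch below shows why a technique such as pixel tracking falls within these rules: the embedding page loads a 1x1 image from a tracking server, and the resulting request transmits information from the user's device (IP address, user agent and any identifier encoded in the image URL) to that server. It uses Python's standard library; the tracker.example URL, the p.gif endpoint and the cid parameter are invented for illustration and are not drawn from the Guidelines.

```python
# Hypothetical, minimal illustration of pixel tracking: a 1x1 image request
# causes information from the user's device (IP address, user agent, and any
# identifiers encoded in the URL) to be transmitted to and logged by a server.
# Endpoint and parameter names are invented for illustration.
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import urlparse, parse_qs

# A minimal 1x1 transparent GIF payload, so the "image" renders invisibly.
TRANSPARENT_GIF = (b"GIF89a\x01\x00\x01\x00\x80\x00\x00\x00\x00\x00\xff\xff\xff"
                   b"!\xf9\x04\x01\x00\x00\x00\x00,\x00\x00\x00\x00\x01\x00\x01"
                   b"\x00\x00\x02\x02D\x01\x00;")

class PixelHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Information the tracking server "gains access" to via the request itself.
        params = parse_qs(urlparse(self.path).query)
        print({
            "ip": self.client_address[0],
            "user_agent": self.headers.get("User-Agent"),
            "campaign_id": params.get("cid", [None])[0],  # identifier baked into the URL
        })
        self.send_response(200)
        self.send_header("Content-Type", "image/gif")
        self.end_headers()
        self.wfile.write(TRANSPARENT_GIF)

if __name__ == "__main__":
    # The embedding page would include: <img src="http://tracker.example/p.gif?cid=123">
    HTTPServer(("localhost", 8000), PixelHandler).serve_forever()
```

On the EDPB's reading set out above, such a technique involves 'gaining access' to information from the terminal equipment and so is caught by the rules even where nothing is stored on the device.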
The Guidelines are out for public consultation until 28 December 2023 and responses can be submitted here.
ICO publishes draft UK BCR Addendum
On 5 October 2023, the ICO published a draft Addendum and guidance documents to streamline the process for UK Binding Corporate Rules approvals. One month later, the ICO adopted the new approach.
The UK's new approach to Binding Corporate Rules under the UK GDPR ("UK BCR") simplifies and speeds up the UK BCR approval process. UK BCRs are one of the safeguards available under the UK GDPR for making overseas data transfers. They consist of data protection policies and commitments for making restricted data transfers between members of an international corporate group.
At present, organisations using binding corporate rules under the EU GDPR ("EU BCR") must create separate UK BCR documents for ICO approval.
The UK BCR Addendum process aims to reduce the burden on organisations, enabling UK BCRs to come into effect by:
- Including the group's approved EU BCRs as a starting point;
- Adding to them a standard UK BCR Addendum, to allow EU BCRs to cover UK restricted transfers too (therefore eliminating the need to produce specific UK BCR documents); and
- Including a UK BCR summary, which includes information for data subjects, such as the types of data being transferred, the types of processing and recipient countries.
The draft UK BCR Addendum can be used either: (i) in its standard form, without alterations; or (ii) as a template, with amendments that are then subject to the ICO's review. In both cases, approval by the ICO is still required; however, approvals are expected to be achievable within a few weeks, rather than the current standard timeframe of 18 months.
It is expected that the ICO will publish the final UK BCR Addendum and guidance by the end of this year.
AI
Biden announces Executive Order on AI
On 30 October 2023, the White House issued an Executive Order on AI, encouraging the "safe, secure and trustworthy development and use of AI" (the "Order"). The Order contains a policy roadmap that encourages the responsible and effective use of AI and sets out disclosure requirements and industry-wide obligations for AI systems. The Executive Order focuses on eight key themes: AI safety and security, privacy, equity and civil rights, consumer benefits, workers, innovation and competition, global cooperation on AI, and the US government's use of AI. For more detail on the key takeaways, please see our summary of the Order.
The Order was a clear signal ahead of the UK's AI Safety Summit in early November 2023 that the US will continue to promote the regulation of AI, both internally and internationally. Although the Order reflects Biden's intent, it relies on the coordination and enforcement of US agencies, multinational governments and companies, with no binding regulations. As just one of a patchwork of global and US regulations, the Order's impact remains to be seen. For further insights, please see our deep dive into AI regulations from an international perspective.
G7 publishes code of conduct on artificial intelligence
As part of the Hiroshima AI process (the G7's project in support of cooperation on the development of responsible AI tools and best practices), some of the world's leading economies agreed in September 2023 to work together on a code of conduct covering generative AI and other forms of technology.
Subsequently on 30 October 2023, the G7 states published an 11-point set of International Guiding Principles, with the aim to "promote safe, secure, and trustworthy AI worldwide".
In these principles, the G7 recognised the risks of AI and the requirement to manage them to protect individuals' safety and privacy. The code of conduct aims to provide best practices for AI development and encourages developers to commit to the practices outlined in the code, although a list of signatories from AI organisations has yet to be released.
The code of conduct would require organisations to take steps including to:
- Take appropriate measures throughout the development of advanced AI systems to identify, evaluate, and mitigate risks across the AI lifecycle.
- Identify and mitigate vulnerabilities, and, where appropriate, incidents and patterns of misuse, after deployment.
- Publicly report advanced AI systems’ capabilities, limitations and domains of appropriate and inappropriate use and work towards responsible information sharing and reporting of incidents.
- Develop, implement and disclose AI governance and risk management policies, grounded in a risk-based approach.
- Invest in and implement robust security controls, including physical security, cybersecurity and insider threat safeguards across the AI lifecycle.
- Develop and deploy reliable content authentication and provenance mechanisms, such as watermarking or other techniques to enable users to identify AI-generated content.
- Prioritise the development of advanced AI systems to address the world’s greatest challenges, notably but not limited to the climate crisis, global health and education.
This code of conduct is the latest in a recent list of AI initiatives, including the US Executive Order on AI and the Bletchley Declaration. We have explored the growing international regulation of AI here and specific insights on the US Executive Order here along with other updates on our technology hub.
EU AI Act negotiations come to a head over foundation model regulations
On 10 November 2023, a technical meeting was held to discuss the next phase of the EU's AI regulations, first proposed in 2021 and designed to regulate AI using a risk-based approach (the "AI Act"). The AI Act is in the final stage of the legislative process, with only a few final points to be determined.
In this meeting, there was concern about how AI systems that include foundation models (such as ChatGPT) should be managed. In previous negotiations on the AI Act, there appeared to be a consensus on a tiered approach to regulation, whereby tougher rules would be imposed on the most powerful models, which would have the largest impact on society. This follows a similar approach to the Digital Markets Act and Digital Services Act.
However, this approach has now faced criticism from, most notably, France, Germany and Italy. These countries argued during the meeting that the AI Act's approach to regulating foundation models could kill AI start-ups and stifle innovation. It is feared that the AI Act could leave these countries behind competitors in the field of AI development, such as the US and China.
In a subsequent meeting on 19 November 2023, a possible compromise was discussed. The discussions focused on applying a tiered approach to General Purpose AI and introducing codes of practice for models posing systemic risks, which would draw a distinction between General Purpose AI models and systems. This was not agreed by the European Commission.
At the latest discussions on 29 November 2023, Spain, on behalf of the EU countries, provided a revised mandate to negotiate this sticking point. This stipulated similar rules, but with tighter thresholds. The European Commission must now agree to this proposal. Failure to agree an approach on these key areas by 6 December 2023, the goal for finalising an agreement, may mean member states back away from the AI Act to avoid perceived overregulation. In addition, the fractured approach could undermine the AI Act's international influence in AI policy and best practice.
Cyber security
NCSC publishes guidelines for secure AI system development
On 27 November 2023, the National Cyber Security Centre ("NCSC"), in collaboration with the US Cybersecurity and Infrastructure Security Agency and 21 other international agencies, published guidelines for providers of AI systems ("AI System Guidelines"). The AI System Guidelines aim to assist providers in building AI systems that operate as intended, are available when required, and work without revealing sensitive data to unauthorised parties.
The AI System Guidelines are structured according to the security risks associated at each stage of the AI system development lifecycle:
- Secure design stage: Covers understanding risks and threat modelling, as well as specific topics and trade-offs in relation to system and model design.
- Secure development stage: Concerns supply chain security, documentation, and asset and technical debt management.
- Secure deployment stage: Provides guidelines on (i) protecting infrastructure and models from compromise, threats or losses; (ii) creating incident management processes; and (iii) responsible release.
- Secure operation and maintenance stage: Considers actions relevant to a system once it has been deployed, including logging, monitoring, update management and information sharing.
DSIT publishes updated code of practice for app store operators and app developers
DSIT has updated the world's first code of practice for app store operators and app developers ("Code") to reflect industry feedback. The Code is centred on enhancing consumer protection, in particular protecting app users from security risks generated by malicious and poorly developed apps. Key changes include clarifying provisions where industry responses highlighted barriers to implementation, and creating an appeals process allowing app developers to challenge the removal of their apps from an app store for breaching the Code.
In order to allow industry players sufficient time to adjust to the revised Code, DSIT has extended the implementation period by nine months to 30 June 2024.
ICBC suffers from ransomware attack that disrupted the US Treasury market
On 8 November 2023, China's largest lender by assets, Industrial and Commercial Bank of China ("ICBC") suffered a ransomware attack. It is alleged that the attack took place due to ICBC failing to undertake timely IT system upgrades.
Ransomware attacks involve crippling an organisation's computer system to the point that it is unable to function unless a ransom is paid. Since the Covid-19 pandemic, ransomware attacks have increased as attackers, who have become increasingly sophisticated, have capitalised on the security vulnerabilities exposed by remote working.
The attack left ICBC unable to fulfil US Treasury market trades, forcing it to reroute trades to other banking institutions and, in some instances, leaving trades unsettled. One consequence was that ICBC temporarily owed Bank of New York Mellon $9 billion. The attack did impact the US Treasury market's liquidity; however, it was not damaging enough to weaken the market's overall functioning.
It is reported that the attack was carried out using 'LockBit 3.0' software. The creator of the software, LockBit, is a high-profile criminal cyber group which has been responsible for similar attacks on other large organisations, such as the City of London and Royal Mail. Following the attack, LockBit claimed responsibility, and ICBC reportedly paid LockBit a ransom. Although controversial, it may not be unusual for victims to make ransom payments rather than endure the leaking or deletion of sensitive data accessed in the attack.
Ransomware group reported their own victim for failing to disclose cyber attack
On 15 November 2023, the ALPHV/BlackCat ransomware group filed a complaint with the US Securities and Exchange Commission ("US SEC") about software company MeridianLink, for failing to comply with the four business day rule to disclose a cyber attack. It is believed that this is the first instance in which a ransomware gang has submitted a complaint to the US SEC against one of its victims. ALPHV/BlackCat alleged that on 7 November 2023, it infiltrated MeridianLink's systems and stole data. Following this, ALPHV/BlackCat requested a ransom fee from MeridianLink, requiring payment within 24 hours of the cyber attack. According to ALPHV/BlackCat, MeridianLink failed to respond to the ransom payment request. It is alleged that the victim's lack of response prompted ALPHV/BlackCat's complaint to the US SEC – the ransomware group also published a screenshot of its complaint on its website on 15 November 2023.
Enforcement and civil litigation
Meta faces EU ban on relying on legitimate interests and contract to process personal data for behavioural advertising
The EDPB instructed the Irish DPC to impose a ban on the processing of personal data for behavioural advertising on the legal bases of contract and legitimate interests across the European Economic Area ("EEA"). It follows Meta shifting its legal basis from contract to legitimate interests in March 2023, after the DPC found in January 2023 that Meta could not rely on contract as a legal basis for the processing. The ban inherently relates to Meta's processing as a controller, as it concerns Meta's lack of a legal basis for the processing, which it is the responsibility of a controller to establish.
The ban only relates to Meta "targeting ads on the basis of inferences drawn from observed behaviour as well as on the basis of data subjects' movements, estimated location and how data subjects interact with ads and user-generated content." It would not prevent Meta from processing personal data for advertising purposes more generally, nor does it prohibit other organisations from relying on legitimate interests for online advertising; it applies only to the specific circumstances of Meta's processing activities. Non-compliance with the EU/EEA-wide ban could lead to Meta being sanctioned with up to €20m or 4% of global turnover.
In order to establish liability of a business user for Meta's lack of a lawful basis, it would need to be demonstrated that the business user's use of Meta pixel in some way related to Meta's subsequent processing for its own targeting purposes. It is likely that this would be very difficult to establish.
In response to the ban, Meta plans to move to consent as the legal basis for its behavioural advertising activities in respect of users in the EEA. It is introducing a subscription model, under which users who do not consent to share their personal data and receive targeted adverts will be charged a monthly fee, which could result in customers paying up to €251.88 per year just to retain their fundamental right to data protection. This "pay or okay" model faces significant criticism and has already been the subject of complaints to supervisory authorities, including from NOYB.
CNIL imposes multiple fines under simplified procedure
France's data protection regulator, the CNIL, fined ten companies a total of €97,000 over two months under its new simplified sanction procedure. This procedure was introduced in April 2022 to help the CNIL handle simple GDPR cases more efficiently. For cases where breaches of GDPR require more severe penalties, decisions are taken under the ordinary procedure.
These fines were issued in response to various complaints, concerning issues including the geolocation of company vehicles, employee video surveillance, data minimisation and individuals' right to object. One of the key infringements concerned the violation of the rights of individuals: the CNIL found that employees' right to privacy was infringed by the continuous recording of geolocation data from employee vehicles, which continued even during break times, without justification.
The CNIL reaffirmed its position that using video surveillance systems to film employees constantly, without justification, is not appropriate. The simplified sanction procedure sends a message that the CNIL is stepping up its GDPR enforcement activity.
EasyJet ICO investigation dropped
Early in November, the ICO announced its decision to drop an investigation into the 2020 EasyJet data breach due to "limited resources", which it considered would be better used elsewhere.
In 2020, EasyJet was subject to a severe cyber attack that led to the data of nine million EasyJet customers being exposed. Information such as email addresses, travel details and credit and debit card details was accessed. Although EasyJet became aware of the breach in January 2020, it did not notify those affected until some five months later. Consequently, customers were left susceptible to heightened security threats such as phishing, financial fraud and identity theft.
The decision comes as a surprise, given the ICO's response to British Airways' data breach in 2018. That breach, which led to the data of half a million customers being harvested, was met with a proposed ICO fine of £183 million in July 2019, before the penalty was reduced to £20 million in October 2020 following representations from the airline.
Security practitioners have described the ICO's decision as "deeply concerning", claiming that it could send the wrong message to organisations in the future. However, the ICO's decision should not be seen as an indication that it is ‘easing up’ or that data breaches will be tolerated.
Ireland finds Airbnb in violation of legitimate interests processing, data minimisation and storage limitation
On 28 September 2023, the Irish Data Protection Commission ("DPC") found that Airbnb Inc was non-compliant with the GDPR in relation to the lawful bases for processing, data minimisation and storage limitation.
A complaint was made in October 2019 after an Airbnb user attempted to verify a guest booking but could only do so by first uploading ID and additional supporting photos. The ID was held by Airbnb until February 2021, before being deleted as part of the response to the complaint, whilst the supplementary photos remained held by Airbnb.
'Legitimate interests' was relied upon as the lawful basis for processing the ID and supplemental photos. The DPC decided that this had been incorrectly relied upon. Although the DPC acknowledged the presence of a legitimate interest, it was deemed unnecessary for Airbnb to process ID and supplementary photos as the initial response – alternative methods of validating identity should have been explored first. It was also noted that Airbnb had prioritised the rights of the host as opposed to the rights of the guest.
The principle of data minimisation is that personal data should be adequate, relevant, and limited to what is necessary in relation to the purposes for which they are processed. The DPC decided that this principle was not adhered to in the course of Airbnb's requirement for unredacted ID and supplementary photos. In addition, the DPC found that the processing of redacted ID photos after the verification process had infringed the storage limitation principle. This principle stipulates that personal data will be kept in a form which permits identification of data subjects for no longer than is necessary for the purposes for which the personal data are processed.
The DPC ordered Airbnb to delete the supplementary photos and change internal procedures to comply with data protection law.
Round-up of enforcement actions
| Company | Authority | Fine | Comment |
| --- | --- | --- | --- |
| Morgan Stanley | The attorney general offices of New York, Connecticut, Florida, Indiana, New Jersey and Vermont | $6.5 million | The bank compromised the personal data of customers after revealing that it had sold corporate computers that held unencrypted personal data and lost servers potentially containing unencrypted customer information. |
| Quality Provider S.A. | Spanish DPA | €20,000 | The company had processed the personal data of a data subject without a valid legal basis and had not sufficiently cooperated with the DPA. |
| Foro Asturias | Spanish DPA | €20,000 | An individual filed a complaint with the DPA after personal data held by the company was disclosed to a media company without authorisation, which then published the data in a newspaper. |