Data Protection update – September 2022
Welcome to our data protection bulletin, covering the key developments in data protection law from September 2022.
Data protection
- The government calls for an open consultation on unauthorised access to online accounts and personal data
- ICO issues guidance on Privacy-Enhancing Technologies
- Data Protection and Digital Information Bill paused as DCMS hints at further changes
- Retained EU Law (Revocation and Reform) Bill introduced to parliament
Cyber security
- Cyber Resilience Act proposed by the European Commission
- Electronic Communications (Security Measures) Regulations set to introduce tougher obligations for telecoms providers
Enforcement
- TikTok at risk of £27 million ICO fine for breaches of children's privacy laws
- ICO reprimands Virgin Media over DSAR response delays
- Record €405 million fine issued by Irish Data Protection Commission against Instagram
- EDPB rules on conflicting fines issued by two supervisory authorities
Civil litigation
- Belgian Market Court refers the Belgian Data Protection Authority's IAB Europe ruling to CJEU
- German court clarifies Schrems II application in relation to subsidiaries of US-parent companies
- CJEU Advocate General issues opinion on the relevance of EU GDPR in competition authority decisions
Data protection
The government calls for an open consultation on unauthorised access to online accounts and personal data
The Home Office has issued a call for information (the "Call") seeking to understand how to reduce both the burden of responsibility that cyber security places on individuals and the incidence of cyber-crime.
The scope of the Call notes that, despite a range of legislation including the Computer Misuse Act 1990, the UK's General Data Protection Regulation ("UK GDPR") and the Data Protection Act 2018 ("DPA"), account access "remains a serious point of potential vulnerability which may lead to unauthorised access and cyber-crime". The Call suggests reasons for this, including weak password security, and the Home Office seeks the views of individuals and businesses on:
- The risks and harms associated with unauthorised access to online accounts and personal data;
- Actions currently taken to address the problem; and
- Actions that should be taken to address the problem and who should be responsible for such actions.
One interesting feature of the Call is the emphasis it places on the need for businesses to adapt to an era in which the security burden on individuals has increased, suggesting that online account login processes should be secure by default rather than over-reliant on customers taking protective actions. As we reported in our April bulletin (here), cyber security breaches continue to increase in frequency, with over a third of UK businesses identifying a cyber-attack in the last 12 months. The Call is intended to support the government in developing new initiatives to protect the public online by keeping the UK's cyber security up to date.
The consultation can be accessed here and responses can be submitted until 27 October 2022.
ICO issues guidance on Privacy-Enhancing Technologies
On 7 September 2022 the Information Commissioner's Office ("ICO") released the fifth chapter of its draft anonymisation, pseudonymisation and privacy enhancing technologies guidance (the "Draft Guidance"). In our June 2021 bulletin we discussed the first chapter of the Draft Guidance (which introduces anonymisation); in our October 2021 bulletin we discussed the second chapter (on the effectiveness of anonymisation); and in our February 2022 bulletin we discussed the third chapter.
Chapter five of the Draft Guidance focuses on Privacy-Enhancing Technologies ("PETs") and aims to provide organisations with the help they need to demonstrate a data protection by design approach.
PETs are described in the Draft Guidance as technologies that "embody fundamental data protection principles by minimising personal data use, maximising data security, and/or empowering individuals".
The Draft Guidance divides PETs into three functional categories:
- PETs that reduce the identifiability of individuals within data, such as synthetic data. These reduce risk by minimising the quantity of personal data processed but, by their nature, may reduce the quality or utility of the data to the extent that it is not genuine data.
- PETs that hide data, such as homomorphic encryption, which permits processing of encrypted data without the plaintext being revealed.
- PETs that split sets of data or introduce access controls to portions of data. This type of PET reduces the identifiability risk, and thus improves the security of the processing.
The Draft Guidance explains in detail how PETs work and the ways in which they can be used by data controllers, in particular to improve the security of the processing of personal data and demonstrate data protection by design and default. However, the Draft Guidance also identifies possible drawbacks of PETs such as a potentially immature market and lack of expertise to use PETs effectively.
The Draft Guidance explains that PETs could allow companies to share and collaborate on the analysis of data, including sensitive data, while still maintaining privacy. That would provide significant opportunity for big-data innovation without compromising a company's legal responsibilities.
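By way of illustration of the first category described above, the short sketch below shows one very simple privacy-enhancing technique: replacing direct identifiers with keyed-hash pseudonyms before a record is shared for analysis. It is our own illustrative example rather than anything taken from the Draft Guidance, and the record fields and key handling are assumptions made for the sketch.

```python
# Illustrative sketch only: pseudonymising direct identifiers with a keyed hash (HMAC)
# before a record is shared for analysis. This shows one very simple PET from the
# "reduce identifiability" category; the record fields and key handling are assumptions.
import hmac
import hashlib

SECRET_KEY = b"replace-with-a-securely-stored-key"  # assumed; in practice held in a key vault

def pseudonymise(value: str) -> str:
    """Return a stable pseudonym for `value` using HMAC-SHA256."""
    return hmac.new(SECRET_KEY, value.encode("utf-8"), hashlib.sha256).hexdigest()

record = {"email": "jane@example.com", "postcode": "AB1 2CD", "monthly_spend_gbp": 42.50}

# Direct identifiers are replaced with pseudonyms; the analytical field is kept as-is.
shared_record = {
    "email_pseudonym": pseudonymise(record["email"]),
    "postcode_pseudonym": pseudonymise(record["postcode"]),
    "monthly_spend_gbp": record["monthly_spend_gbp"],
}
print(shared_record)
```

Because the key allows the original values to be recovered, data treated in this way generally remains personal data; the point of the sketch is simply to show how a PET can reduce what is exposed during the processing itself.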
The ICO's consultation on the Draft Guidance is open until 31 December 2022 and this chapter of the Draft Guidance can be read here.
Data Protection and Digital Information Bill paused as DCMS hints at further changes
The second reading of the Data Protection and Digital Information Bill (the "Bill") has been delayed following the appointment of Elizabeth Truss as Prime Minister, to give Ministers more time to consider the Bill. In our July bulletin we took a deep dive into the changes the Bill proposes in order to update and simplify the UK's data protection framework. The Bill will make extensive amendments to the UK GDPR, the DPA and the UK Privacy and Electronic Communications Regulations as part of the UK Government's approach to retained EU law. The second reading of the Bill was scheduled for 5 September 2022 and no new date has yet been set.
The progress of the Bill and the full text can be found here.
On 3 October 2022, at the Conservative Party Conference, the Secretary of State for Digital, Culture, Media & Sport ("DCMS"), Michelle Donelan, hinted that the Bill may be subject to change before it is reintroduced to Parliament. She said: "We will be replacing GDPR with our own business- and consumer-friendly British data protection system" and criticised what she considers to be the "needless regulations and business-stifling elements" of the current regime. DCMS will instead be "taking the best bits from others around the world to form a truly bespoke, British system of data protection". As the Bill in its current form retains the GDPR at its core, this suggests that DCMS may intend to make significant changes to it before it is reintroduced.
Retained EU Law (Revocation and Reform) Bill introduced to Parliament
The Retained EU Law (Revocation and Reform) Bill (known as the "Brexit Freedoms Bill") is set to remove the prioritised status of EU law that was retained by the UK post Brexit. It will make it easier for Parliament to amend, repeal or replace the estimated 2,400 pieces of retained EU law in the UK. The Brexit Freedoms Bill contains a sunset date by which all retained EU law must either be revoked or assimilated into UK law. This will impact many different provisions, notably, in a data protection context, the UK GDPR and significant European data protection case law.
The Brexit Freedoms Bill has the following key provisions:
- Sunsetting Retained EU law: the majority of retained EU Law will expire on 31 December 2023, unless it is otherwise preserved.
- Ending supremacy of retained EU law: the UK Government will ensure domestic UK legislation takes priority over retained direct EU legislation.
- Assimilated law: any retained EU law preserved after 31 December 2023 will be assimilated and the EU interpretation features will no longer apply.
- Facilitating departures from retained EU case law: the Brexit Freedoms Bill will provide domestic UK courts with greater discretion to depart from retained case law.
- Modification of retained EU law: the status of retained direct EU legislation will be downgraded, and powers in other statutes will be modified so that they can more easily be used to amend retained EU legislation, subject to appropriate consideration.
The government has given no indication of which regulations it intends to revoke, retain or amend under the Brexit Freedoms Bill (or any timeline for when this information will be provided). As the UK GDPR is EU direct legislation retained in the UK, the suggestion is that it will disappear unless it is preserved before 31 December 2023. It is possible that it may be retained under the Bill, in whatever form the Bill takes when it is reintroduced to Parliament.
The government is maintaining a dashboard of the status of retained EU law, including key data protection legislation, available here.
Cyber security
Cyber Resilience Act proposed by the European Commission
The European Commission has published its proposal for the Cyber Resilience Act ("CRA"). Its focus is to set cybersecurity requirements for products with digital elements and to strengthen cybersecurity rules to ensure more secure hardware and software products across the EU Member States. The CRA has been proposed against a backdrop of increasing volume and impact of cyber-attacks on both individuals and companies, with the global annual cost of cybercrime estimated at €5.5 trillion by 2021.
The CRA defines products with digital elements as "any software or hardware product and its remote data processing solutions, including software or hardware components to be placed on the market separately". The CRA is primarily aimed at manufacturers within the EU, but it will also extend, in some areas, to distributors and importers. The CRA will not apply to products which have already been placed on the EU market unless they are substantially modified. In addition, the CRA does not apply to cloud services such as software-as-a-service products, which will be regulated by the draft NIS2 Directive, which we discussed in our May 2022 bulletin.
The CRA sets out two main objectives, along with a number of more specific objectives. The two main objectives are:
1. Creating conditions for the development of secure products by reducing vulnerabilities for the whole of a product's life cycle; and
2. Providing cybersecurity information to individuals when selecting and using products with digital elements.
The CRA provides for Member States to appoint national market surveillance authorities to carry out surveillance in the relevant territory. In addition, Member States have the power, under the CRA, to set administrative fines for non-compliance, among other corrective measures. The CRA sets the maximum fines as follows (the "whichever is higher" cap in each tier is illustrated in the sketch after the list):
1. Up to €15,000,000 or, if the offender is an undertaking, up to 2.5% of its total worldwide annual turnover for the preceding financial year, whichever is higher, for non-compliance with the essential cybersecurity requirements in Annex 1 and the obligations in Articles 10 and 11;
2. Up to €10,000,000 or, if the offender is an undertaking, up to 2% of its total worldwide annual turnover for the preceding financial year, whichever is higher, for non-compliance with any other obligations under the CRA; and
3. Up to €5,000,000 or, if the offender is an undertaking, up to 1% of its total worldwide annual turnover for the preceding financial year, whichever is higher, for supplying incorrect, incomplete or misleading information to notified bodies and market surveillance authorities.
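To make the "whichever is higher" mechanics concrete, the following sketch works through the three tiers using hypothetical turnover figures; the turnover amounts are assumptions for illustration only and are not drawn from the CRA proposal.

```python
# Illustrative sketch of the "whichever is higher" cap in the proposed CRA fine tiers.
# The turnover figures below are hypothetical and are used only to show the arithmetic.

def max_fine(fixed_cap_eur: int, turnover_pct: float, annual_turnover_eur: int) -> float:
    """Return the higher of the fixed cap and the turnover-based cap."""
    return max(fixed_cap_eur, annual_turnover_eur * turnover_pct)

# Tier 1: essential cybersecurity requirements (Annex 1, Articles 10 and 11)
print(max_fine(15_000_000, 0.025, 2_000_000_000))  # hypothetical €2bn turnover -> €50,000,000
# Tier 2: any other obligation under the CRA
print(max_fine(10_000_000, 0.02, 300_000_000))     # hypothetical €300m turnover -> €10,000,000
# Tier 3: incorrect, incomplete or misleading information supplied to authorities
print(max_fine(5_000_000, 0.01, 300_000_000))      # hypothetical €300m turnover -> €5,000,000
```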
Electronic Communications (Security Measures) Regulations set to introduce tougher obligations for telecoms providers
Following a consultation and survey, the government has confirmed the new Electronic Communications (Security Measures) Regulations will be laid before Parliament under the Telecommunications (Security) Act 2021 (the "Act"). We reported on the consultation and survey in our March bulletin and explained that the new regulations and code of practice, developed with the National Cyber Security Centre and Ofcom (together, the "Package"), would have a significant impact on public telecoms providers.
The Package covers obligations and legal duties including (but not limited to): ensuring that network providers understand and record the risks of security compromises to network architecture and act to reduce them; protecting network management workstations from exposure to incoming signals and the wider internet; and protecting tools that enable the monitoring or analysis of the use or operation of UK networks and services. Following the initial consultation and survey, the Package has changed to clarify that security measures will be targeted at the parts of the network which are most in need of protection. The Package also includes further guidance on national resilience, security patching and legacy network protections for providers.
Ofcom will oversee compliance with the Act and will be given powers to inspect the premises and systems of telecoms providers, as well as powers to impose fines.
The Package includes timelines for providers to achieve certain obligations including:
- Taking account of supply chain risks by controlling access and considering their ability to change network operations;
- Understanding their security risks and being able to identify anomalous activity, with regular reporting; and
- Protecting software that monitors and analyses their networks.
Public telecoms providers have until March 2024 to meet these obligations.
Enforcement
TikTok at risk of £27 million ICO fine for breaches of children's privacy laws
The ICO has issued a notice of intent to TikTok Inc and TikTok Information Technologies UK Limited (collectively, "TikTok") signalling the ICO's intent to fine TikTok for breaches of data protection legislation.
The ICO identified a number of possible breaches of data protection law at TikTok between May 2018 and July 2020, including: processing of data of children under the age of 13 without appropriate parental consent; failure to provide proper information to users in a concise, transparent and easy to understand manner; and processing of special category data without a lawful basis.
Whilst the ICO's findings are provisional, Information Commissioner John Edwards noted that "Companies providing digital services have a legal duty to put [child privacy] protections in place, but [the ICO's] provisional view is that TikTok fell short of meeting that requirement". Interestingly, the possible breaches cover a period prior to the enforcement window of the Age Appropriate Design Code and the requirement to consider the best interests of the child as a primary factor. We considered that requirement in detail in our May bulletin here. The Commissioner also confirmed that the ICO is carrying out six further investigations into companies with digital services that may not have adequately protected children's data.
If, after consideration of representations made by TikTok, the ICO upholds its provisional findings, TikTok could be liable for a fine of £27 million for the alleged data protection breaches.
The ICO's statement of 26 September 2022 can be read in full here.
ICO reprimands Virgin Media over DSAR response delays
The ICO has reprimanded Virgin Media Limited ("Virgin Media") following an investigation into its compliance with Data Subject Access Requests ("DSARs").
The ICO found that, between July 2021 and April 2022, Virgin Media had not complied with Articles 15 and 12(3) of the UK GDPR in a number of instances when responding to DSARs. In particular, Virgin Media was found not to have complied with its obligation to respond to DSARs without undue delay and in any event within the statutory timescales.
The ICO used its powers under Article 58 UK GDPR to issue Virgin Media with a reprimand, as opposed to a fine. Reprimands are an enforcement mechanism used by the ICO to give non-compliant companies a period of time to remedy their non-compliance before any fine is imposed. In Virgin Media's case, the ICO has recommended that Virgin Media take further steps to ensure that DSARs are responded to within the relevant statutory deadlines, and that Virgin Media ensures that it has sufficient staff resourcing to respond properly to DSARs. The ICO has given Virgin Media an initial three-month deadline to demonstrate its compliance, and will require a further update six months from the date of the reprimand.
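For readers who want to sanity-check DSAR timescales, the short sketch below applies one common reading of the UK GDPR Article 12(3) timescale: one calendar month from receipt, extendable by up to two further months for complex or numerous requests. It is an illustration only, not an ICO tool or legal advice, and the calendar-month arithmetic is simplified.

```python
# Minimal sketch: one common reading of the UK GDPR Article 12(3) timescales for a DSAR,
# i.e. one calendar month from receipt, extendable by up to two further months for
# complex or numerous requests. Illustrative only; not an ICO tool or legal advice.
from datetime import date
from dateutil.relativedelta import relativedelta  # third-party: python-dateutil

def dsar_deadline(received: date, extended: bool = False) -> date:
    """Corresponding calendar date one (or three) months after receipt."""
    return received + relativedelta(months=3 if extended else 1)

print(dsar_deadline(date(2022, 1, 31)))                 # 2022-02-28 (clamped to month end)
print(dsar_deadline(date(2022, 1, 31), extended=True))  # 2022-04-30
```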
The original reprimand from the ICO can be read here.
Record €405 million fine issued by Irish Data Protection Commission against Instagram
Meta Platforms Ireland Limited ("Instagram") has been issued with a €405 million fine by the Irish Data Protection Commission ("DPC") for breaches of the EU General Data Protection Regulation ("EU GDPR") in relation to children's personal data collected on Instagram.
The fine follows a binding decision adopted by the European Data Protection Board ("EDPB") after the draft decision from the DPC. The particular facts of the matter relate to the public disclosure of email addresses and phone numbers of child users of Instagram's business account feature and a 'public-by-default' setting for the personal accounts of children using Instagram. The EDPB's decision considered a variety of objections from supervisory authorities in Germany, France, Finland, Italy, the Netherlands and Norway. In light of these objections, the EDPB required the addition of an infringement of Article 6(1) EU GDPR relating to whether the 'public-by-default' processing was necessary or proportionate. Accordingly, the specific breaches of the EU GDPR by Instagram in respect of which the DPC issued its fine are of:
- Article 5(1)(a) and Article 5(1)(c) in relation to the fairness and data minimisation principles;
- Article 6(1) in relation to lawful processing;
- Article 12(1) on transparency;
- Article 24 in relation to Instagram's responsibilities as a data controller;
- Article 25(1) and Article 25(2) in relation to the protection of data by design and default; and
- Article 35(1) in relation to data protection impact assessments.
The fine is one of the largest issued under the EU GDPR, and the largest issued by the DPC. Instagram is reportedly considering appealing the fine. The DPC's public statement relating to the fine can be read here.
EDPB rules on conflicting fines issued by two supervisory authorities
The EDPB has exercised its powers under Article 65 EU GDPR to resolve a dispute between two supervisory authorities in different jurisdictions over the level of a fine.
An initial €100,000 fine was issued to Accor S.A. ("Accor") by the French data protection supervisory authority ("CNIL") for breaches of the EU GDPR. The breaches related to Accor failing to take into account the right to object to the receipt of marketing messages and making it difficult for data subjects to exercise the right of access. Various supervisory authorities across the EEA had received complaints about Accor and, as Accor's main establishment was in France, the CNIL was the lead supervisory authority for the investigation. The CNIL issued the fine following the circulation of its draft decision to the other supervisory authorities, basing the quantum on the most up-to-date financial information available at the time: the 2019/2020 financial year.
Under Article 60 EU GDPR, the CNIL was under an obligation to cooperate with other relevant data protection authorities to reach a mutually accepted application of the EU GDPR and, in doing so, the Polish data protection authority ("Polish DPA") notified the CNIL of its objection to the quantum of the fine. The Polish DPA did not believe the fine proposed would be effective, proportionate or dissuasive, given that it accounted for less than a twentieth of a percent of Accor's turnover. The CNIL had taken into consideration, within the quantum calculation, that the infringements had not been structural and Accor had taken corrective measures following the investigation. In addition, the significant drop in Accor's turnover between 2019 and 2020 (due to the Covid-19 pandemic) was considered by CNIL to be a mitigating factor under Article 83(2)(k) EU GDPR.
The EDPB gave three main reasons for agreeing with the Polish DPA, namely that: (i) the relevant preceding year's turnover for the fine should be the year preceding the final decision (2020/2021) rather than the most up to date financial information when the decision was circulated to the other supervisory authorities (2019/2020); (ii) the relevant turnover figure should not be double counted as both a mitigatory factor and a cap on the limit of any fine; and (iii) the fine issued was not sufficiently large to be dissuasive taking into account the turnover of Accor.
The EDPB increased the fine to €600,000 to achieve the effect of Article 83(1) EU GDPR by imposing a fine that would be effective, proportionate and dissuasive.
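The Polish DPA's objection turned on simple arithmetic: the proposed fine expressed as a percentage of annual turnover. The sketch below shows that calculation with a hypothetical turnover figure; Accor's actual figures are not reproduced here.

```python
# Illustrative arithmetic only: a fine expressed as a percentage of annual turnover,
# the comparison underlying the Polish DPA's objection that €100,000 was less than
# a twentieth of a percent (0.05%) of Accor's turnover. The turnover figure is hypothetical.

def fine_as_pct_of_turnover(fine_eur: float, turnover_eur: float) -> float:
    return fine_eur / turnover_eur * 100

hypothetical_turnover_eur = 2_000_000_000  # assumed €2bn, for illustration only
print(fine_as_pct_of_turnover(100_000, hypothetical_turnover_eur))  # 0.005 (%)
print(fine_as_pct_of_turnover(600_000, hypothetical_turnover_eur))  # 0.03 (%)
```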
To read the English-language redacted decision of the EDPB in full click here.
Civil litigation
Belgian Market Court refers the Belgian Data Protection Authority's IAB Europe ruling to CJEU
The Belgian Market Court, a part of the Belgian Court of Appeal, has referred a number of preliminary questions to the Court of Justice of the European Union ("CJEU") in relation to the appeal by the Interactive Advertising Bureau Europe ("IAB Europe") against a decision of the Belgian Data Protection Authority ("Belgian DPA"). We previously reported in our February bulletin (here) that IAB Europe's Transparency and Consent Framework (the "TCF") facilitates the management of users' preferences for online personalised advertising and plays a key role in real-time bidding. The TCF does this via a consent management platform pop-up presented on first use of a website or application. The pop-up acts as an interface in which users can consent to the collection and sharing of their personal data or object to various types of processing. The TCF captures these preferences in a code which is placed on users' devices (the "TC String").
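Conceptually, the TC String is a compact record of a user's consent choices that can be written to, and read back from, the user's device. The sketch below illustrates that idea only; it is not the real TCF encoding (which uses a bit-field format defined in the TCF specification), and the purpose names and encoding shown are assumptions made purely for the illustration.

```python
# Conceptual sketch only: capturing per-purpose consent choices in a compact string
# stored on the user's device. This is NOT the real IAB TCF "TC String" format, which
# uses a bit-field encoding defined by the TCF specification; the purpose names and
# encoding below are assumptions made purely for illustration.
import base64
import json

def encode_preferences(prefs: dict) -> str:
    """Serialise consent choices into a compact, URL-safe token."""
    raw = json.dumps(prefs, separators=(",", ":")).encode("utf-8")
    return base64.urlsafe_b64encode(raw).decode("ascii")

def decode_preferences(token: str) -> dict:
    """Read the choices back from the token."""
    return json.loads(base64.urlsafe_b64decode(token.encode("ascii")).decode("utf-8"))

prefs = {"personalised_ads": True, "ad_measurement": False, "store_device_info": True}
token = encode_preferences(prefs)  # in practice placed on the device, e.g. in a cookie
print(token)
print(decode_preferences(token))
```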
IAB Europe did not consider that it was a data controller for the TC String. However, the Belgian DPA held, to the contrary, that the TC String was personal data and that IAB Europe was therefore a data controller. In addition, the Belgian DPA found that IAB Europe breached its obligations as data controller under the EU GDPR in relation to: (i) the lack of a lawful basis; (ii) the lack of a legal basis for processing; (iii) accountability, security and privacy by design measures (including in relation to international transfers); and (iv) various controller obligations. The Belgian Market Court has referred to the CJEU questions on whether the TC String can be considered personal data and whether it renders IAB Europe a controller. The CJEU decision, which is not expected until 2023, could have wide-ranging implications for online advertising if it finds IAB Europe to be a data controller in line with the Belgian DPA's decision, as it could lead to far more onerous data protection obligations being placed on advertising organisations.
A redacted copy of the Belgian Market Court's judgment can be read, in English, here.
German court clarifies Schrems II application in relation to subsidiaries of US-parent companies
The Karlsruhe Higher Regional Court ("OLG Karlsruhe") has ruled that being the subsidiary of a US company is not sufficient on its own to presume that there is an international transfer of personal data.
The case before the OLG Karlsruhe centred on a public procurement contract awarded in Germany to a company which would use a Luxembourg-based subsidiary to store personal data on physical servers located in Germany. As part of a review of the award of public tender contracts, the Baden-Württemberg public procurement chamber ("Vergabekammer BW") deemed that the fact that the stored data could be accessed by the contractor's US parent company constituted a transfer of data under Article 44 EU GDPR. The transfer would not have attracted the essentially equivalent level of protection for personal data required following Schrems II.
In August, the Baden-Württemberg data protection authority raised concerns with the decision of the Vergabekammer BW. In particular, it was concerned that the risk of potential access was not the same as actual access and that genuine transmission was required to constitute "processing" for the purposes of international transfers of personal data. In addition, the decision of the Vergabekammer BW was questioned with respect to how much consideration it gave to the technical and organisational measures in place (such as encryption and adherence to the post-Schrems II SCCs) between the subsidiary and its US parent company.
The OLG Karlsruhe held that the fact that a company was a subsidiary of a US entity was not enough to assume that it would allow its US parent, in breach of the subsidiary's contractual obligations to the German counterparty, to access personal data stored, contrary to the assumption drawn previously following Schrems II. These contractual obligations constituted a safeguard designed to protect personal data in an international transfer situation which should have been given greater consideration by the Vergabekammer BW. The concern relating to whether genuine access was required to constitute a transfer was not specifically addressed by the OLG Karlsruhe.
The public statement made by the OLG Karlsruhe to communicate its decision can be read here (German only).
CJEU Advocate General issues opinion on the relevance of EU GDPR in competition authority decisions
Athanasios Rantos, an Advocate General of the CJEU, has raised the possibility of competition authorities being able to consider EU GDPR compliance concerns, despite not having jurisdiction to enforce EU privacy laws.
Rantos gave this non-binding opinion to the CJEU on 20 September 2022 after the Düsseldorf Higher Regional Court referred to the CJEU the question of whether competition authorities could take the EU GDPR into account, as part of a challenge by Meta, the parent company of Facebook, WhatsApp and Instagram, against the German Federal Cartel Office's prohibition of certain data processing activities carried out by Meta's group companies.
In the opinion, Rantos noted that when considering whether a company has abused its dominant market position, a competition authority may interpret "rules other than those relating to competition law, such as those of the [EU] GDPR". When doing so, the competition authority must still have regard to its duty to inform and cooperate with the relevant supervisory authorities, so that it interprets the EU GDPR consistently with those authorities. The opinion suggests that a competition authority may consider compliance with the EU GDPR as an incidental question as part of its ruling on competition rules, but may not enforce the EU GDPR directly.
Whilst the advocate general's opinion is non-binding, it raises the possibility of competition authorities focussing on data protection provisions when considering breaches of competition rules. The advocate general's opinion can be read in full here.