Data Protection update - June 2021
Welcome to our Data Protection bulletin, covering the key developments in data protection law from June 2021.
Data protection
- The European Commission adopts adequacy decisions for the UK
- TIGRR calls for the UK GDPR to be replaced
- The EDPB has adopted its final Recommendations on measures that supplement transfer tools
- A draft of the ICO's first chapter of its new guidance on anonymisation, pseudonymisation and privacy enhancing technologies has been published
- ICO raises concerns about the use of live facial recognition technology in public spaces and the issues surrounding data protection compliance
Cyber security
Enforcement
- ICO PECR enforcement roundup
- Record-breaking $425 million fine proposed for Amazon privacy violations
- US tech companies exposed to EU-wide regulation after EU court decision
- IKEA France fined €1 million for spying on staff
Civil litigation
- Article 27 Representatives not liable for data controllers' data protection breaches
- ICCL's action aims to prevent Real-Time Bidding by challenging advertising industry rules
- Italian Supreme Court holds that individuals cannot consent to processing by an algorithm if they are not adequately informed of its underlying logic
Data protection
The European Commission adopts adequacy decisions for the UK
On 28 June 2021, the European Commission ("EC") adopted two adequacy decisions in relation to the United Kingdom, which ensure the ongoing transfer of personal data between the EEA and the UK post-Brexit (the "Decisions"). In its press release, the EC has confirmed that: "Personal data can now flow freely from the European Union to the United Kingdom where it benefits from an essentially equivalent level of protection to that guaranteed under EU law."
This may be unsurprising to some, as the UK's data protection law at present is almost identical to the EU's GDPR, though there are minor discrepancies between the two regimes. However, in the face of growing criticism from privacy NGOs, it seems likely that the CJEU will be called upon to re-assess the UK's adequacy status in the not-too-distant future. In any event, the legal situation in the UK will be kept under continual review, and the inclusion of a first-of-its-kind sunset clause in the Decisions means that they will automatically expire after four years unless renewed.
What is more, it has since been reported (in an online article published by the Telegraph on 19 June 2021, accessible here) that the UK Government may take the view that British judges should not automatically be bound by judgments of the European Court of Human Rights ("ECHR"). The UK's adequacy status may be jeopardised if the UK Government were to adopt a more formal stance in this respect. In fact, provision is expressly made in the Decisions for such a divergence: “This conclusion is based on both the relevant UK domestic regime and its international commitments, in particular adherence to the European Convention of Human Rights and submission to the jurisdiction of the European Court of Human Rights. Continued adherence to such international obligations is therefore a particularly important element of the assessment on which this Decision is based.”
It should be noted that, following the recent Court of Appeal decision which held that the UK's immigration exemption (contained in paragraph 4 of Schedule 2 to the DPA 2018) was incompatible with the GDPR (reported on in our previous update, accessible here), the Decisions expressly exclude from their scope any transfers covering "personal data that is transferred for purposes of United Kingdom immigration control or that otherwise falls within the scope of the exemption…" (Recital 6).
TIGRR calls for the UK GDPR to be replaced
Just as the Decisions on the adequacy of the UK's data protection regime were being finalised, on 16 June 2021 the Taskforce on Innovation, Growth and Regulatory Reform ("TIGRR") published a report on its regulatory vision for the UK. The report, by a group of MPs, outlines their recommendations to the Prime Minister on how to seize new opportunities that have arisen post-Brexit. Of particular note, TIGRR's report criticised the UK GDPR, stating that in practice it "overwhelms people with consent requests and complexity they cannot understand, while unnecessarily restricting the use of data for worthwhile purposes". The report encouraged the introduction of a new data protection framework: "We propose reform to give stronger rights and powers to consumers and citizens, place proper responsibility on companies using data, and free up data for innovation and in the public interest." If the Government chooses to move forward with the recommendations, it remains to be seen how this would impact the UK's adequacy status.
The EDPB has adopted its final Recommendations on measures that supplement transfer tools
On 21 June 2021, the European Data Protection Board ("EDPB") adopted its final recommendations on measures that supplement transfer tools to ensure compliance with the EU level of protection of personal data (the "Recommendations"). This marks an important (and long-awaited) milestone for those organisations that transfer personal data from within the EEA to a country outside it. Following the conclusion of a public consultation, a number of changes were made to the first draft, which was introduced in November 2020, shortly after the CJEU's decision in Schrems II (the "Draft Recommendations") (please see our previous analysis here and here).
The most notable of these changes are outlined below:
- While the Draft Recommendations recognised the need for organisations to assess relevant legislation when conducting a Transfer Impact Assessment ("TIA"), the Recommendations emphasise that organisations should also take into account the practices in place in the third country.
- In particular, if an organisation's TIA reveals that the transfer of international data may fall within the scope of "problematic legislation", the Recommendations make clear that an organisation must ascertain that it has "no reason to believe that relevant and problematic legislation will be interpreted and/or applied in practice". Annex 3 to the Recommendations contains a list (organised in order of preference) of the possible sources of information that may be referred to in assessing such practices. The assessment must be documented in order to demonstrate that problematic legislation will not be applied in practice to the transferred data. This is a demanding standard that will take some thinking through to apply in practice.
- The Draft Recommendations asserted that organisations should not rely on subjective factors, such as the actual likelihood of public authorities' access to data in a manner not in line with EU standards, when making an assessment of the third country's legal system. However, the Recommendations seem to have adopted a wider stance in this regard, accepting that when a TIA is being conducted, data exporters may take account of "documented practical experience of the importer with relevant prior instances of requests for access received from public authorities in the third country." However, it should be noted that a mere absence of prior requests made to the recipient will not be sufficient – this will need to be backed up by relevant, objective, reliable, verifiable, and publicly available or otherwise accessible information.
- While the Recommendations continue to adopt a narrow interpretation of the derogations contained in Article 49 GDPR, reliance on those derogations is no longer stated as being limited to "occasional and non-repetitive transfers".
- As a result of the decision in Schrems II, the Recommendations refer to the fact that when binding corporate rules are used as transfer safeguards, they may be subject to additional requirements that will be provided at a future date.
While the Recommendations are not legally binding, they will carry significant weight and are likely to influence the decisions reached by EU supervisory authorities and in EU courts. Clearly, the Recommendations will not apply in the UK, but since Schrems II continues to apply in the UK, they provide a strong indication of the approach that will be expected here too. It would therefore be a good idea for organisations that conduct TIAs to familiarise themselves with the six-stage process contained in the Recommendations.
A draft of the ICO's first chapter of its new guidance on anonymisation, pseudonymisation and privacy enhancing technologies has been published
The ICO has called for views on the draft first chapter of its anonymisation, pseudonymisation and privacy enhancing technologies guidance (the "Draft Guidance"). The Draft Guidance is intended to clarify the issues that should be considered by organisations so as to ensure that anonymisation techniques are used effectively.
The first chapter of the Draft Guidance provides an introduction to two key concepts that are relevant to data protection law in the UK.
- "Anonymisation" is defined as "the way in which you turn personal data into anonymous information, so that it then falls outside the scope of data protection law".
- "Pseudonymisation" is defined as "a technique that replaces or removes information that identifies an individual".
The Draft Guidance states that while anonymisation is certainly not always necessary, or even desired, organisations may wish to anonymise information so that it no longer constitutes "personal data" subject to data protection requirements. Anonymisation therefore provides organisations with an opportunity to limit the data protection risks to which they are exposed. However, it should also be noted that the ICO has confirmed that the methods used to turn personal data into anonymous information still constitute processing.
The Draft Guidance refers to pseudonymisation as a "security and risk mitigation measure" with multiple benefits. For example, pseudonymisation techniques may simplify the process of data protection compliance, while also reducing the amount of data that organisations are required to consider when responding to requests from individuals. The Draft Guidance confirms that pseudonymous data still constitutes personal data and so the processing of such data must comply with data protection legislation.
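By way of an illustrative sketch only (not drawn from the Draft Guidance; the function and field names below are hypothetical), pseudonymisation is commonly implemented by replacing direct identifiers with keyed tokens, with the key kept separately as the "additional information" needed for re-identification:

```python
import hmac
import hashlib

def pseudonymise(record: dict, identifying_fields: list, secret_key: bytes) -> dict:
    """Replace identifying fields with keyed HMAC-SHA256 tokens.

    Note: the output is still personal data under the GDPR. Anyone
    holding secret_key can link tokens back to individuals, so the
    key must be stored separately and securely.
    """
    result = dict(record)
    for field in identifying_fields:
        if field in result:
            token = hmac.new(
                secret_key, str(result[field]).encode(), hashlib.sha256
            ).hexdigest()
            result[field] = token
    return result

# Hypothetical example record
key = b"keep-this-key-separate-and-secure"
record = {"name": "Jane Doe", "email": "jane@example.com", "purchase": "sofa"}
pseudo = pseudonymise(record, ["name", "email"], key)
# The tokens are deterministic: the same input and key always yield the
# same token, so records can still be linked across datasets without
# exposing the underlying identities.
```

Because the key allows re-identification, this technique reduces risk but does not produce anonymous information; truly anonymising the data would require irreversibly severing the link to the individual.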
The ICO's consultation closes on 28 November 2021; however, it is anticipated that future chapters of the guidance may be published before this date.
ICO raises concerns about the use of live facial recognition technology in public spaces and the issues surrounding data protection compliance
On 17 June 2021, the ICO published an official opinion on the use of live facial recognition technology ("LFR") in public spaces (the "Opinion"). Within this document, the ICO provided details in relation to its investigation of six examples of planned or actual use of LFR in public spaces. In each case, it was discovered that data protection legislation had not been fully complied with. The ICO further assessed nine data protection impact assessments ("DPIAs") in order to inform its assessment of the data protection issues arising from the use of LFR in public spaces.
In a blog post, Elizabeth Denham, the Information Commissioner, stated that she was "deeply concerned about the potential for live facial recognition (LFR) technology to be used inappropriately, excessively or even recklessly." She continued by stating that "It is not my role to endorse or ban a technology but, while this technology is developing and not widely deployed, we have an opportunity to ensure it does not expand without due regard for data protection."
The Opinion states that the ICO is conducting further investigations into the use of LFR in public spaces and it may not be long before we see enforcement action in this area.
Shortly after the ICO published the Opinion, the EDPB and the European Data Protection Supervisor ("EDPS") adopted a joint opinion which effectively called for a general ban on the use of LFR in public areas. Andrea Jelinek, EDPB Chair and Wojciech Wiewiórowski, EDPS, said: “Deploying remote biometric identification in publicly accessible spaces means the end of anonymity in those places. Applications such as live facial recognition interfere with fundamental rights and freedoms to such an extent that they may call into question the essence of these rights and freedoms."
Cyber security
Cybersecurity risks feature on this year's G7 Leaders' Summit agenda
This month, leaders from across the globe met in the UK for the G7 Leaders' Summit. Throughout the proceedings, cyber security risks were identified as an "escalating shared threat", with ransomware attacks stated as requiring urgent attention. Central to such discussions were calls for the Russian Government to hold accountable those individuals or groups located within Russia that are responsible for ransomware attacks. This follows a speech given by Lindy Cameron, chief executive of the National Cyber Security Centre, who stated that cybercriminals "don't exist in a vacuum". She followed by saying "they are often enabled and facilitated by states acting with impunity".
Enforcement
ICO PECR enforcement roundup
The ICO has, in the last month, issued fines totalling £445,000 to six organisations for nuisance communications in breach of regulation 22 of the Privacy and Electronic Communications Regulations (“PECR”):
- Colour Car Sales Ltd ("CCSL") received a £170,000 fine for sending 3,650,194 spam text messages between October 2018 and January 2020;
- Solarwave Limited ("SL") received a £100,000 fine for making 73,217 unsolicited calls between January and October 2020 to people registered with the Telephone Preference Service list ("TPS");
- LTH Holdings ("LTH") received a £145,000 fine for making 1.4 million calls between May 2019 and May 2020 to people registered with the TPS;
- The Conservative Party received a £10,000 fine for sending 1.19 million direct marketing emails (although there was no way to know the proportion of these that were validly sent) between 24 July and 31 July 2019;
- Global One 2015 received a £10,000 fine for sending 573,000 direct marketing emails between April 2020 and May 2020; and
- Papa John's (GB) Limited received a £10,000 fine for sending over 210,000 nuisance text messages between 1 October 2019 and 30 April 2020.
In addition to being fined, CCSL, SL and LTH were issued with Enforcement Notices ordering them to stop their unlawful direct marketing.
Record-breaking $425 million fine proposed for Amazon privacy violations
The Data Protection Authority ("DPA") of Luxembourg (the "CNPD") has reportedly proposed a fine of over $425 million against Amazon.com Inc. If approved by the other Supervisory Authorities across the EU, this would be the highest fine imposed under the GDPR to date (dwarfing the €50 million fine issued to Google by CNIL, which we covered in our January 2019 update and the appeal in our June 2020 update). However, whilst the absolute amount of this fine is eye-watering, it is worth bearing in mind that the fine represents approximately 0.1% of Amazon's worldwide turnover.
Before the fine is issued, the other EU Supervisory Authorities must approve the draft decision under the GDPR’s cooperation procedure. If they fail to resolve objections amicably, there will be a debate and vote among the Supervisory Authorities on the EDPB. It is reported that CNPD has received at least one objection stating that the fine should, in fact, be higher.
Media reports suggest that the case relates to Amazon's collection and use of personal data, rather than to the cloud-computing business Amazon Web Services (in our March update we reported on a decision of the French Supreme Court regarding a subsidiary of Amazon Web Services). It remains to be seen whether the action taken by the CNPD is linked to allegations which were reportedly raised earlier this year by former Amazon employees in connection with the company’s data security practices.
US tech companies exposed to EU-wide regulation after EU court decision
The CJEU has ruled that national data regulators have the power, in certain circumstances, to pursue their own cases against tech companies even if they are not the lead regulator for the relevant tech firm.
Under the GDPR's "one-stop-shop" mechanism, organisations that conduct cross-border data processing are required to work primarily with their lead supervisory authority, which will be the supervisory authority based in the same Member State as the organisation’s main establishment (usually its EU headquarters).
There are two principal exceptions to the "one-stop-shop" mechanism: firstly, if the subject matter relates only to an establishment in its Member State, or substantially affects data subjects only in its Member State (Article 56(2) GDPR); and secondly, the "urgency procedure" under Article 66 GDPR where, in exceptional circumstances (where it is considered that there is “an urgent need to act in order to protect the rights and freedoms of data subjects”), the supervisory authority can immediately adopt provisional measures and can thereafter refer the matter to the EDPB for an urgent opinion or binding decision.
In the instant case, the Belgian Supervisory Authority had opened a probe in 2015 against Facebook regarding insufficient notification to users about data collection and use. Facebook had argued that the complaint should have been heard in Ireland, rather than in Belgium, given the location of its headquarters (and the identity of its lead Supervisory Authority, the Irish Data Protection Commissioner (“Irish DPC”)).
The Belgian Supervisory Authority had sought various information from the Irish DPC in connection with its probe, but had not received a response. This lack of co-operation engaged Article 61(8) GDPR, which provides that, where a Supervisory Authority does not provide information requested by another Supervisory Authority, the requestor may adopt provisional measures and may require an urgent binding decision from the EDPB pursuant to Article 66(2). In these circumstances, the urgent need to act under Article 66(1) is presumed to have been met.
This decision may encourage Supervisory Authorities to take matters into their own hands, where permitted by the GDPR, rather than awaiting action from other Supervisory Authorities. This is a particular opportunity for Supervisory Authorities looking at big tech firms, which are often registered in Ireland (e.g. Google, Microsoft, and Twitter). We have previously reported (in our May update) that the European Parliament has adopted a resolution calling for infringement procedures to start against the Irish DPC because of an "insufficient level of enforcement of the GDPR". The decision could therefore increase enforcement by alleviating some of the burden the Irish DPC has carried.
IKEA France fined €1 million for spying on staff
A criminal court in Versailles has fined the French subsidiary of Ikea ("Ikea France") €1 million for spying on hundreds of employees between 2009 and 2012. Former CEO, Jean-Louis Baillot, has also been ordered to pay €50,000 for storing personal data, and has been handed a two-year suspended sentence. The executive in charge of risk management, Jean-Francois Paris, has been fined €10,000 and given an 18-month suspended sentence.
In court, a store manager explained how he had obtained personal data from his cousin in the police by asking him to "cast an eye" over 49 candidates selected for Ikea jobs. That led to three candidates, who had committed minor offences, having their job offers withdrawn.
The scale of the spying activities was made plain when the court was reportedly told that Ikea France's annual bill for private investigators ran to as much as €600,000 and the surveillance covered around 400 people.
Civil litigation
Article 27 Representatives not liable for data controllers' data protection breaches
A notable impact of the GDPR's introduction has been that data controllers based outside of the EEA, which offer their goods or services to individuals in the EU, have been required to appoint an EEA representative under Article 27 of the GDPR. Following the case of Sansó Rondón v LexisNexis Risk Solutions UK Limited [2021] EWHC 1427 (QB), there is now clarity about the position of Article 27 representatives in circumstances where the non-EEA data controller has breached data protection requirements.
The Defendant in the case was a designated representative of World Compliance Inc ("Worldco"), which is a US company. The Claimant issued a claim against the Defendant in relation to a number of alleged breaches of the GDPR arising from Worldco's processing of the Claimant's personal data via its WorldCompliance database. The Defendant applied for strike out/summary judgment on the basis that, as an Article 27 representative, it could not be held liable for the actions of the controller which had appointed it.
The Defendant was successful in having the claim against it struck out. Collins Rice J held that Article 27 does not create “representative liability” – that is to say, representatives are not required “to stand in the shoes of a controller as a respondent/defendant to enforcement action”.
The Judge could find “no positive encouragement for ‘representative liability’ anywhere other than the last sentence of Rec. 80” (which reads as follows: “The designated representative should be subject to enforcement proceedings in the event of non-compliance by the controller or processor”). Despite posing a “challenge” to her ultimate view, Recital 80 was dismissed by the Judge as an “interpretative sidewind”, insufficient to displace the “weighty” proposition that “if the GDPR had intended to achieve ‘representative liability’ then it would necessarily have said so more clearly in its operative provisions”.
Whilst the ICO did not seek to intervene in the proceedings, it nevertheless expressed the following view on the interpretative question at issue:
"the role of an Article 27 representative (…) is limited to that of conduit of communications between the overseas entity and the ICO or relevant data subjects".
Notably, Collins Rice J’s conception of the role of an Article 27 representative was more expansive than that espoused by the ICO. The language of “conduit”, she noted, “does not fully capture the job the GDPR gives to representatives”. The role is “considerably fuller…than a mere postbox ‘to be addressed’”; “[a]t its core is a bespoke suite of directly-imposed functions…crafted to fit together with, and belong in the triangle of, the relationships between controller, ICO and data subject”; “[t]he job focuses on providing local transparency and availability to data subjects, and local regulatory co-operation”.
The Judge has granted the Claimant permission to appeal.
ICCL's action aims to prevent Real-Time Bidding by challenging advertising industry rules
The Irish Council for Civil Liberties ("ICCL") has announced that one of its senior fellows, Dr Johnny Ryan, has filed an action before the Hamburg District Court against IAB Technology Laboratory, Inc, the affiliated standard setting body of the Interactive Advertising Bureau, and various other Defendants in respect of allegations that the online advertising industry breaches the GDPR, and domestic German legislation, in particular through the use of real-time bidding ("RTB").
RTB is a core tool utilised by participants in the adtech ecosystem to place online ads. Together with the practices of the adtech industry more broadly, it has become the subject of increasing scrutiny amongst privacy campaigners and regulators across various jurisdictions, including Ireland, Belgium and the UK, over the last few years.
Italian Supreme Court holds that individuals cannot consent to processing by an algorithm if they are not adequately informed of its underlying logic
The Italian Court of Cassation has held that the reliance on data subjects’ consent for the proposed processing of personal data, using an automated system designed to assess the reputation of individuals, was unlawful.