Data Protection update - March 2021
Welcome to our Data Protection bulletin, covering the key developments in data protection law from March 2021.
Data protection
- Latest developments on overseas transfers of personal data
- UK data transfers: DCMS and the ICO set out process for UK adequacy assessments
- Council adopts position on the proposed ePrivacy Regulation
- European Commission Vice President raises possibility of centralised data protection enforcement
- Proposed amendments to DIFC Data Protection Law
- CNIL issues guidance on the data protection implications related to the use of chatbots
- Europe prepares for the handling of AI data related issues
Cyber security
- The EBA forced offline by the Microsoft Exchange hack
Regulatory enforcement
- ICO issues fines totalling £330,000 to companies sending spam texts
- ICO issues Enforcement Notice to food company relying on the "soft opt-in" under PECR for electronic marketing
- International transfers: Bavarian DPA finds use of the newsletter tool Mailchimp unlawful
- Transfers to EU-based processors with US parent companies: French Court decision on applicability of Schrems II
- Spanish Data Protection Authority issues a record-breaking €8,150,000 fine on Vodafone Spain
- Large-scale data breach of health data investigated by the French DPA
Civil litigation
- UK Government confirms no plans to introduce opt-out law for data protection claims
- Tribunal supports ICO’s approach on the granularity of consent required for PECR compliance
Data protection
Latest developments on overseas transfers of personal data
On 25 March 2021, the European Parliament adopted a resolution evaluating the GDPR, two years after its entry into application. The resolution contained some criticisms of the effectiveness of the GDPR and, in particular, reinforced the European Parliament's position that mass surveillance programmes involving the bulk collection of data should prevent adequacy findings for third countries. The resolution gives a clear direction that the European Commission should apply the judgments of the Court of Justice of the European Union ("CJEU") in Schrems I, Schrems II and the Privacy International case to all adequacy decisions moving forward.
On the same day, it was also announced that the EU and US have decided to step up negotiations on an enhanced EU-US Privacy Shield framework to comply with the CJEU’s judgment last year in Schrems II. In a joint statement issued by the European Commissioner for Justice, Didier Reynders, and US Secretary of Commerce, Gina Raimondo, it was said that: “These negotiations underscore our shared commitment to privacy, data protection and the rule of law and our mutual recognition of the importance of transatlantic data flows to our respective citizens, economies, and societies.”
It was also announced on 30 March 2021 that the adequacy talks between the European Union and the Republic of Korea, which have been ongoing since 2017, have successfully concluded. The European Commission is now taking steps to have its adequacy decision in respect of South Korea adopted in the coming months.
UK data transfers: DCMS and the ICO set out process for UK adequacy assessments
As the UK establishes the process for making its own adequacy decisions for post-Brexit data transfers from the UK, the Secretary of State for the DCMS and the ICO have agreed a Memorandum of Understanding ("MoU"), recognising their respective roles and responsibilities in assessing the adequacy of third countries. While the Secretary of State for the DCMS has responsibility for making adequacy decisions, he or she will consult the ICO before making such a decision and will take into account, but not be bound by, its representations.
The ICO and DCMS have jointly stated that the MoU “will ensure the ICO’s position as an independent regulator is not impacted by its role in adequacy assessments while setting out key principles that will continue our strong working relationship.”
Council adopts position on the proposed ePrivacy Regulation
The Council of the European Union agreed a mandate on 10 February for it to negotiate and finalise the terms of the proposed ePrivacy Regulation with the European Parliament and the European Commission. The ePrivacy Regulation was first approved by the European Commission in 2017, so this seems to be a positive step forward in a long-drawn-out process. Analysis of the differences between the Council's position and those of the European Parliament and Commission, dating from 2017, indicates disagreement over issues such as whether consent is required for cookies placed for certain non-essential purposes such as fraud prevention, and whether electronic communications metadata may be processed by third parties where this is required by law.
The ePrivacy Regulation is intended to replace the existing ePrivacy Directive 2002/58 and will govern the protection of privacy and confidentiality of electronic communications services in the EU. Once adopted, the ePrivacy Regulation will have direct effect in all EU member states and although it will not automatically apply in the UK, the UK may well look at updating its e-privacy regime, which was also based on the 2002 Directive.
European Commission Vice President raises possibility of centralised data protection enforcement
On the topic of disputes at EU level, European Commission Vice President Vera Jourova recently criticised supervisory authorities for their "public squabbles" and suggested that centralised data protection enforcement could lie ahead if cooperation does not improve. Tension has recently been building over the time it has taken the Irish DPC to complete its ongoing probes into US tech companies, such as Amazon and Facebook, whose European headquarters are located in Ireland. Last month, Ireland's data protection commissioner, Helen Dixon, responded to concerns by saying that such criticism had been based on inaccurate and incomplete information. Jourova stated: "Such public squabbles don't contribute to the creation of mutual trust, and I can only appeal to data-protection authorities to focus on the issues and improve their cooperation." She added that, if such cooperation is not possible, a more centralised model may have to be considered.
Proposed amendments to DIFC Data Protection Law
The Dubai International Financial Centre Authority (“DIFC Authority”) recently launched a thirty-day consultation on its proposed amendments to the Data Protection Law, DIFC Law No.5 of 2020 (the “Data Protection Law”).
The proposed amendments to the Data Protection Law were outlined in Part A of the consultation paper. It is intended that such amendments would:
- make the judicial redress process for individuals clearer;
- improve and clarify the accountability requirements for controllers and processors; and
- review and improve the powers available to the Commissioner of Data Protection when reviewing a direction or determination of contravention of the Data Protection Law.
The deadline for submitting responses to the consultation paper was on 28 March 2021. The DIFC Authority will now review those comments and amend the proposal before enacting any changes. This marks the latest attempt by the DIFC Authority to update its data protection regime in light of recent international developments in this area.
CNIL issues guidance on the data protection implications related to the use of chatbots
On 19 February 2021, the French data protection authority, CNIL, issued guidance for firms looking to integrate chatbots within their services while remaining compliant with data protection laws (the "Guidance"). The use of chatbots can have wide-ranging data protection implications. For example, the CNIL notes that personal data will still be processed by controllers even if users are not required to create accounts or identify themselves before using a chatbot.
While the Guidance is only directly applicable to organisations established or operating in France, the following points may be of note to any organisation considering the data protection issues arising from the use of chatbots:
- The Guidance noted that, while the prior collection of GDPR-compliant user consent was one way to satisfy e-privacy requirements, it was not always necessary. The Guidance stated that prior consent would not be required if the chatbot was activated by a user before the cookies were deposited. In such cases, where there had been an express user request, there would be no need for prior consent if the cookies were strictly necessary for the provision of an online communication service.
- The length of time allowed for the retention of data would depend on the reasoning behind the processing of personal data by the chatbot.
- Chatbots should not make any automated decisions that would have a significant impact on the data subject.
- If it is anticipated that the chatbot will process special categories of data, the data processing must fall under one of the exceptions in Article 9(2) of the GDPR. In addition, a data protection impact assessment may be required.
Europe prepares for the handling of AI data related issues
The European Commission is expected to release its AI framework on 21 April 2021 (the “Framework”). It is anticipated that the Framework will significantly increase the role of the national data protection authorities (“DPAs”) in enforcing this area. While regulators have already started to offer advice on certain aspects of AI, concerns have been raised that the additional enforcement role would place a significant strain on the already stretched resources of DPAs. One way to relieve the pressure on DPAs may be to establish a European AI oversight body to assist the DPAs in responding and adapting to the upcoming Framework.
While on the topic of AI, those in the insurance sector should note that European bodies are taking proactive steps to ensure that AI is delivered in a “responsible and ethical” manner. To this effect, a joint declaration was signed by UNI Europa Finance, Insurance Europe, the European Federation of Insurance Intermediaries and the Association of Mutual Insurers and Insurance Cooperatives in Europe. The agreement stated that transparency requirements are key when insurance companies, workers and trade unions are processing data using AI.
Cyber security
The EBA has been forced offline by the Microsoft Exchange hack
The European Banking Authority ("EBA") was forced to take its systems offline after discovering that its servers had been compromised as part of a global Microsoft Exchange cyber-attack. The personal data held on those servers may have been accessed as a result. The EBA will keep its entire email system offline while it assesses the damage, stating in a press release: "The EBA is working to identify what, if any, data was accessed."
It has been reported that the attackers exploited vulnerabilities in Microsoft's Exchange email system, putting major businesses and governments who use the system at risk. While the extent of the attack has not yet been established, it has been estimated that some 30,000 US organisations may have been affected.
Microsoft believes that Hafnium, a Chinese state-sponsored attacker, is responsible; however, this has not been confirmed, nor is it clear what the motives behind the attack were.
Regulatory enforcement
ICO issues fines totalling £330,000 to companies sending spam texts
Two firms have received fines totalling £330,000 from the ICO for sending nuisance text messages, in breach of regulation 22 of the PECR, during the Covid-19 pandemic:
- Leads Works Ltd ("LWL") received a £250,000 fine for sending over 2.6 million nuisance text messages between May and June 2020; and
- Valca Vehicle Ltd ("VVL") received an £80,000 fine for sending 95,000 nuisance text messages between June and July 2020.
In addition to being fined, both LWL and VVL were issued with Enforcement Notices, ordering them to stop sending the nuisance text messages.
ICO issues Enforcement Notice to food company relying on the "soft opt-in" under PECR for electronic marketing
Muscle Foods Limited ("MFL") has received an Enforcement Notice from the ICO after transmitting over 142 million unsolicited communications (made up of emails and SMS messages) for the purposes of direct marketing, in contravention of regulation 22 of PECR.
MFL did not have consent to send these marketing communications, and could not rely on the "soft opt-in" under regulation 22(3) PECR because it had failed to give the recipients an opportunity, at the point their details were collected, to refuse the use of their contact details for direct marketing purposes.
International transfers: Bavarian DPA finds use of the newsletter tool Mailchimp unlawful
The Bavarian DPA ("BLDA") has held that the transfer of subscribers' email addresses from a German company to the US was unlawful under Article 44 GDPR, applying the CJEU's findings in Schrems II (C-311/18).
FOGS Magazin, Munich ("FMM") transferred the email addresses to the US-based company The Rocket Science Group LLC, the provider of the email marketing platform Mailchimp. FMM had EU standard data protection clauses ("SCCs") in place between itself and The Rocket Science Group LLC.
However, as the CJEU made clear in Schrems II, SCCs do not necessarily, in and of themselves, provide adequate protection for personal data. Instead, organisations proposing to export data to importers outside the EEA and adequate jurisdictions must ensure that the jurisdiction in which the importer is based provides safeguards essentially equivalent to those under the GDPR, taking into account the domestic law applicable to the importer. Where this is not the case (as the CJEU considered applied in respect of the US in Schrems II), supplementary measures are needed to remedy the shortfall. If an organisation cannot implement effective supplementary measures, it must cease any ongoing transfer or notify the competent authority.
Here, the BLDA found that there were "at least indications that Mailchimp may in principle be subject to data access by U.S. intelligence services" (translated by machine from the German original). Accordingly, FMM should only have transferred the data if it implemented supplementary measures (falling into three broad categories: technical, contractual and organisational). FMM had not assessed whether such measures were necessary (let alone implemented them).
The BLDA did not impose any further action on FMM following a declaration that it will stop using Mailchimp. However, the decision emphasises the need for organisations to carefully assess their transfer mechanisms in light of the decision in Schrems II. A sensible starting point for this assessment is the two sets of guidance issued by the EDPB in November last year, accessible here and here.
Transfers to EU-based processors with US parent companies: French court decision on applicability of Schrems II
In a further decision in relation to Schrems II, the Conseil d'État, France's Supreme Court for administrative justice, held that Doctolib, a French company which provides a platform to book vaccine appointments, had not breached GDPR in using AWS Sarl (“AWSS”), a Luxembourg subsidiary of the US-based Amazon Web Services Inc, to provide hosting services to Doctolib.
The Claimants argued that AWSS, as a subsidiary of a US company, was at risk of being required by US authorities to provide personal data which it held, and such data would not therefore be adequately protected.
The Conseil d'État acknowledged this risk. This is notable because the data was being sent to a French company and held on servers in France: it shows that Schrems II transfer risk factors may be relevant even where personal data does not leave the EU, if the data may still be at risk through a supplier's third-country parent company.
However, the Conseil d'État held that this risk was sufficiently mitigated by various safeguards in place between Doctolib and AWSS, including: legal safeguards (such as contractual obligations to challenge any access request by the US authorities), procedural safeguards (such as the deletion of personal data after three months and the absence of any sensitive personal data being processed), and technical safeguards (such as the encryption of data hosted by AWSS).
This decision provides some reassurance that, where appropriate safeguards are put in place, simply having data hosted on servers owned by companies with US parents will not automatically lead to data being unlawfully exported under the GDPR.
Of course, if the data had been hosted in the US, the outcome may well have been different, and this serves to emphasise the need for organisations to: (i) properly understand how their data is hosted by service providers; and (ii) take into account relevant transfer risks attaching to service providers' group companies.
Spanish Data Protection Authority issues a record-breaking €8,150,000 fine on Vodafone Spain
We reported in our January bulletin that Caixabank, S.A. ("CSA") had received a record-breaking fine from the Spanish Data Protection Authority ("AEPD") totalling €6,000,000 for breaching Articles 6, 13 and 14 of the GDPR.
The decision came just a few months after the AEPD imposed a fine of €5,000,000 on Banco Bilbao Vizcaya Argentaria ("BBVA") in December for similar breaches of the GDPR. In breach of Articles 6 and 13 of the GDPR, BBVA had also failed to provide an appropriate mechanism for receiving consent from data subjects, and its privacy policy failed to specify the type of personal data processed and the purpose and legal basis for the processing.
Earlier this month it was revealed that Vodafone Spain ("VS") had received a fine from the AEPD totalling €8,150,000, made up of four smaller fines. The largest of the fines (totalling €4,000,000) principally related to VS, in breach of Article 28 GDPR, failing to provide: (i) sufficient guarantees as to the implementation of appropriate technical and organisational measures; and (ii) prior written authorisation in respect of the technical and organisational measures in place. Two of the smaller fines (of €2,000,000 each) related to failures in respect of international data transfers (in breach of Article 44 GDPR) and communications sent to customers who had previously objected to the processing of their personal data (in breach of various provisions of domestic Spanish law, one of which is referable to Article 21 GDPR). This fine is now the largest administrative fine issued by the AEPD to date.
In the 11 months leading up to December, the AEPD issued administrative fines totalling €2,800,000; it has since issued over €20,000,000 in fines, evidencing a marked uptick in its enforcement activity.
Large-scale data breach of health data investigated by the French DPA
The French DPA, CNIL, is investigating a large-scale data breach involving the health data of approximately 500,000 individuals. It is not clear how widely the file has been shared, but the CNIL has asked Internet Service Providers to block access to a site while it continues its investigation. The personal data is reported to include patients' names, contact information, social security numbers, blood groups, relevant health conditions (such as pregnancy), drug treatments and pathologies (including HIV status).
This breach serves to emphasise two contextual factors which are of particular importance to companies in the health sector. Firstly, the Covid-19 pandemic has increased the number of cyber-attacks targeting organisations in this sector. Secondly, the CNIL has announced that health data security will be an enforcement priority going forward. Accordingly, companies operating in this sector should take heed and ensure that they have appropriate technical and organisational measures in place.
For a detailed analysis of the kind of steps this may entail, a good starting point is our analysis of the Monetary Penalty Notices issued by the ICO to British Airways, Marriott and Ticketmaster, which can be found here.
Civil litigation
UK Government confirms no plans to introduce opt-out law for data protection claims
The DCMS has confirmed, perhaps unsurprisingly, that there will be no change to the current regime to allow non-profit groups ("NPGs") to bring opt-out data protection claims on behalf of individuals without their permission. As things stand, individuals need to give their permission for NPGs to take action on their behalf (for example, by complaining to the ICO or commencing court proceedings) in relation to data protection claims.
DCMS' approach was summarised as follows:
"The government has considered the arguments for and against implementing Article 80(2) of the UK GDPR which would permit non-profit organisations to represent individuals without their authority. The current regime already offers strong protections for individuals, including vulnerable groups and children, and routes for redress. In the government’s view, there is insufficient evidence of systemic failings in the current regime to warrant new opt-out proceedings in the courts for infringements of data protection legislation, or to conclude that any consequent benefits for data subjects would outweigh the potential impacts on businesses and other organisations, the ICO and the judicial system."
DCMS emphasised that, in reaching this view it was "wary of the risk of unintended consequences" and noted the views of business groups who say such a change "could increase litigation costs and insurance premiums during a period of economic uncertainty."
Tribunal supports ICO’s approach on the granularity of consent required for PECR compliance
The First-tier Tribunal (General Regulatory Chamber) recently determined in Koypo Laboratories Ltd v Information Commissioner [2021] UKFTT 2020_0263 that a monetary penalty issued by the ICO should be reduced from £100,000 to £80,000, and provided guidance on the consent required for compliance with PECR.
The Tribunal considered that the ICO was correct to find that Koypo had instigated direct marketing by affiliates without consent in breach of PECR. In particular, the Tribunal noted, amongst other things:
- The Appellant was a partner of WRM Media ("WRM"), which had sent millions of unsolicited emails. The Tribunal agreed with the ICO that it was irrelevant that the Appellant did not send the emails itself; they were sent by WRM at the Appellant's behest, and the Appellant was responsible for ensuring that they were sent lawfully;
- The Appellant was only named in a list of partners (alongside other “Claims Management” companies) which were found in a page hyperlinked to the fair processing notice of WRM where users could “fine tune” their consent options;
- The privacy policy on one of the WRM sites contained a section stating that the data subject is in control and that "registration forms will always have an unticked box for third party marketing, which you can optionally select to allow your personal data to be sent to our partners for direct marketing purposes". This gave the impression that a data subject would either receive no third party marketing or all of it; and
- This binary choice was also found on another site run by a partner of the Appellant. That second site offered a single category, "financial", covering social security, insurance and financial institutions; again, the choice was between no direct marketing at all or direct marketing from every company within that category.
In addition to confirming the importance of specificity if consent is to be validly obtained, the judgment cited two CJEU decisions with approval (Verbraucherzentrale Bundesverband eV v Planet49 GmbH [2020] 1 WLR 2248 and Orange Romania SA v ANSPDCP [2020] EUECJ C-61/19). This emphasises that the UK courts will continue to apply European decisions when interpreting PECR and other relevant legislation in force prior to Brexit.