Data Protection update - November 2021

Welcome to our data protection bulletin, covering the key developments in data protection law from November 2021.

Data protection

Enforcement

Civil litigation

Data protection

New UK Information Commissioner

We said goodbye last month to the UK Information Commissioner, Elizabeth Denham, whose term of office came to an end on 30 November 2021.

Since she took over from Christopher Graham in July 2016, Denham's office has dealt with the implementation of the GDPR, the UK's departure from the EU and the challenges brought by the global pandemic.  She has been instrumental in building frameworks and guidance in response to flagship data protection legislation, and her work has focussed attention on the role data protection must play in innovation and digital acceleration, and on the importance of organisations taking accountability for what they do with personal data.

John Edwards, who is the current Privacy Commissioner for New Zealand, has been approved as the next Information Commissioner and is expected to take up his position on 3 January 2022. Paul Arnold, the ICO's Deputy Chief Executive, will be designated as the ICO's Accounting Officer from 1 December 2021 until 2 January 2022.

EDPB issues guidance on international transfers

The European Data Protection Board ("EDPB") has issued guidelines 05/2021 (the "Guidelines") on the interplay between Article 3 of the EU General Data Protection Regulation 2016/679 ("GDPR") and the provisions on international data transfers ("Transfers") under Chapter V of the GDPR. 

Following the outcome of Data Protection Commissioner v Facebook Ireland Limited & Maximillian Schrems (Case C-311/18) (Schrems II), a Transfer must be accompanied by a risk assessment of the Article 46 GDPR transfer tool used in that Transfer, commonly standard contractual clauses ("SCCs").

The Guidelines address confusion as to whether a transfer to an importer outside of the EU, which was already subject to the GDPR by virtue of the extraterritoriality of Article 3(2), was considered a Transfer which would require SCCs to be put in place, along with transfer risk assessments.

The EDPB's criteria for processing that qualifies as a Transfer are:

  1. The exporter (a controller or processor) has to be subject to the GDPR for that specific processing;
  2. The exporter discloses by transmission or otherwise makes the personal data subject to the processing of an importer (a controller or processor); and
  3. The importer is in a third country or is an international organisation, irrespective of whether the importer is subject to the GDPR in respect of the given processing in accordance with Article 3.

To explain the first criterion, a Transfer may take place whether or not a data exporter is established in the EU but the processing must be subject to the GDPR.  Article 3(1) and (2) say that the GDPR applies to:

  • processing by a controller or processor carried out in the context of the activities of an establishment of that controller or processor within the EU; or
  • processing of personal data by controllers or processors not established in the EU where the processing activity relates to the offering of goods or services or monitoring behaviour (to the extent that the behaviour is in the EU).

The second criterion requires that there must be a data controller or data processor actually making data available to another controller or processor.  Although this sounds straightforward, the Guidelines confirm that situations where an employee accesses personal data on their company database, while located in a third country, are excluded.

The final criterion requires that the data importer is located in a third country or is an international organisation, regardless of whether the importer's processing is subject to the GDPR.  This can produce some counter-intuitive results, as in the example given in the Guidelines:

a controller outside of the EU (Company A) offers goods and services to individuals in the EU.  Company A uses a processor located in France (Company B) to process the personal data, and Company B sends the personal data back to Company A once it has completed the processing.  The processing by Company B is covered by the GDPR for processor-specific obligations under Article 3(1), as it takes place within the context of the activities of Company B's establishment in the EU.  The processing by Company A is also subject to the GDPR under Article 3(2), as Company A is offering goods and services to individuals in the EU.  Once Company B finishes its processing and sends the finished product back to Company A (outside of the EU), a Transfer has been made.

In this example, steps would still need to be taken to comply with the conditions of Chapter V despite both parties being subject to the GDPR. However, the Guidelines mention that the safeguards applied should be bespoke to the situation; so a Transfer back to a controller would accordingly require less protection if that controller is already subject to the GDPR for the given processing.  The Guidelines hint that when developing relevant transfer tools (currently only available in theory) the Article 3(2) situation should be considered to avoid duplicating GDPR and to "address the elements and principles that are "missing" and, thus, needed to fill the gaps".  Specifically, these "gaps" relate to: conflicting national laws; government access; and challenges in enforcement and obtaining remedies against entities outside of the EU.  The Guidelines say the EDPB "encourages and stands ready to cooperate" in the development of a new set of SCCs for a situation where the importer is already subject to the GDPR via Article 3(2). We will update you on any progress in this regard.

Bruno Gencarelli statements at the IAPP congress

Bruno Gencarelli, the head of the international data flows and protection unit at the European Commission, gave a couple of key updates in his speech to the IAPP Europe Data Protection Congress in Brussels on 17 November 2021.

Firstly, he warned that the UK's proposed data framework reform (which we reported on in our October update here and in an article on post-reform data flows here) could have a "significant, substantive impact" on its data adequacy status with the EU.  He said: "If we find a system adequate, we find a system adequate looking at it in its entirety, from a global point of view.  But if certain fundamental, structural elements of that system are changed, significantly changed, it is no longer the same system, and that raises questions which may create concerns and raise issues."  The UK's consultation on data protection reform has now closed.

Secondly, he said that there had been "big progress" on a data transfer agreement with the US but, that "it will be of no use… to conclude something that would not be durable" and specifically said that, in respect of the end of 2021 being a deadline for a transfer agreement, "it's an artificial timeline; it's a natural time to take stock.  If we are there by the end of the year, we will be happy to be there (…) but we are not there yet (…) and if it is that we need a little bit more time, we will take more time".  This agreement is anticipated to replace the Privacy Shield, which was invalidated by Schrems II as we reported here.

Data Trade Principles announced by G7

At the G7 Trade Track, on 22 October 2021, the G7 countries agreed a set of Digital Trade Principles aimed at promoting open digital markets, data free flow with trust, safeguards for workers, consumers and businesses, digital trading systems, and fair and inclusive global governance.  Of specific importance is the principle that personal data must be protected by high enforceable standards, including when it is transferred across borders.  The importance of cooperation on data protection was also highlighted along with common regulatory approaches.  The G7 identified data localisation (where personal data is required to be retained only within the country in which it was created) as a key concern where it is used for protectionist and discriminatory purposes.  This is due to data localisation having the potential to undermine open and democratic societies.  Interestingly, avoiding data localisation was a theme of the international talks by the OECD in July (as reported in our update here).

EDPB issues statement on proposed digital legislation

On 18 November 2021, the EDPB issued a statement on the Digital Services Package and Data Strategy (together, the "Package").  The Package aims "to facilitate the further use and sharing of (personal) data between more public and private parties inside 'the data economy', to support the use of specific technologies such as Big Data and AI and to regulate online platforms and gatekeepers".  The EDPB has raised serious concerns about the implications of the Package for the fundamental rights and freedoms of individuals and society as a whole.  These concerns are:

  • The Package would allow the use of AI systems to categorise individuals using biometrics according to ethnicity, gender, political or sexual orientation;
  • The use of AI to infer emotions of a natural person raises highly undesirable consequences;
  • AI should be banned for automated facial recognition in publicly accessible spaces;
  • Interoperability requirements should be introduced to promote a more open digital environment; and
  • Online targeted advertising should be more strictly regulated and pervasive tracking-based targeted advertising should be prohibited along with the profiling of children.

The EDPB has requested further clarification to ensure existing data protection rules are not affected or undermined by the Package and that data protection rules will prevail where personal data is processed.

Research finds employee monitoring spiked in the last six months

The trade union Prospect found that one in three workers reported being monitored at work in October 2021, compared to one in four in April 2021.  It also reported a doubling in the use of webcam monitoring over the last six months.

Prospect said monitoring was particularly likely to affect workers in sectors with "higher levels of remote working, larger proportions of younger workers, and low levels of trade union membership" which included tech workers.  The shadow digital minister of the UK's Labour Party, Chi Onwurah, commented that "new technology allows employers to have a constant window into their employees' homes, and the use of the technology is largely unregulated by government".  Following the research, Prospect has called upon the Information Commissioner's Office ("ICO") to ensure workers get a say in the introduction of workplace technology and to ensure transparency in the way those technologies are used. 

Our employment team published an article here on the risks of remote working and the extent to which employees could be monitored.

China Personal Information Protection Law enters into force

On 1 November 2021, China's Personal Information Protection Law ("PIPL") entered into force.  As we reported in our August update, the PIPL provides rules for the processing of personal and sensitive information as well as for personal information protection processors, data subject rights and onward transfers.

Breaches of the PIPL's provisions can result in penalties amounting to up to 5% of annual turnover of the offending organisation.

Belgium Data Protection Authority in hot water

On 12 November 2021, the European Commission (the "Commission") issued a 'reasoned opinion' to Belgium for failing to ensure the full independence of its Data Protection Authority (the "BDPA").  In particular, the BDPA has been accused of violating Article 52 of the GDPR which states that supervisory authorities must perform tasks and exercise powers independently. 

The reasoning behind the decision is that the Commission noted that some members of the BDPA reported to management committees which took part in government projects on COVID-19 contact tracing, or were members of the information security commission, both of which are dependent on the Belgian government.  The announcement followed a letter of formal notice, sent on 9 June 2021, after which the BDPA had two months to take corrective measures.  However, the members have remained in their posts.  The BDPA now has two months to take the relevant action before the Commission can refer the case to the Court of Justice of the European Union.

UK considering divergence from EU NIS threshold reporting

The UK Government has set out its response to the consultation issued in respect of incident reporting thresholds for relevant digital service providers ("DSPs") in scope of the Network and Information Systems ("NIS") Regulations 2018 and the European Commission Implementing Regulation 151/2018 (the "Implementing Regulation").  For the purposes of the NIS Regulations, relevant digital service providers include: online search engines; online marketplaces; and cloud computing services.  At present, the threshold for DSPs to report a NIS incident is set at a level appropriate to an EU market.  The Government's view, prior to the consultation, was that this threshold was no longer appropriate following the UK's exit from the European Union.

As we reported in our October update here, at present a DSP must consider the following factors when determining whether the impact of an incident is substantial:

  • The number of users affected by an incident, including users relying on the service for their own services;
  • Duration of the incident;
  • Geographical impact of the incident;
  • Extent of disruption of functioning of the service; and
  • Extent of impact on economic and societal activities.

In addition, the DSP must identify a ground under Article 4 of the Implementing Regulation, which sets out the situations in which an incident will be considered substantial.  These include: the service being unavailable for over five million user-hours; loss of integrity, authenticity or confidentiality of processed data affecting over 100,000 users; a risk to public safety or loss of life; or material damage of over €1 million.
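As a rough illustration only, the Article 4 grounds operate as alternative thresholds, any one of which makes an incident substantial.  The field names in this sketch are our own assumptions for readability, not the regulatory wording:

```python
# Illustrative sketch of the alternative Article 4 "substantial incident"
# grounds described above. Field names are assumptions, not regulatory text.
from dataclasses import dataclass


@dataclass
class Incident:
    user_hours_unavailable: float   # hours of unavailability x users affected
    users_with_data_affected: int   # loss of integrity/authenticity/confidentiality
    risk_to_public_safety: bool     # risk to public safety, security or loss of life
    material_damage_eur: float      # material damage to at least one user


def meets_article_4_ground(i: Incident) -> bool:
    # Any single ground is sufficient; the grounds are not cumulative.
    return (
        i.user_hours_unavailable > 5_000_000
        or i.users_with_data_affected > 100_000
        or i.risk_to_public_safety
        or i.material_damage_eur > 1_000_000
    )
```

Under the proposal discussed below, this ground-based test would be replaced by a single reporting threshold set out in ICO guidance.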

The proposal under consultation would remove the requirement for an Article 4 ground in favour of an incident reporting threshold to be provided by the ICO in forthcoming guidance.  The responses were generally positive or neutral towards the proposals, although some areas of concern were reported.  The most frequent concern was that the ICO should not have the power to amend the thresholds without prior consultation.  The Government's response was that the ICO had a commitment to regular engagement with industry, demonstrated by the ICO's own consultation on the proposed reporting thresholds, which launched on 10 September 2021.  Another key concern was that amending the threshold would cause NIS reporting to diverge from the EU reporting level.  The Government acknowledged this was a risk but described it as "necessary" as the current thresholds were "not fit for purpose".

NCSC reports record cyber attacks

The National Cyber Security Centre ("NCSC") has said in its annual review (the "Review") that it handled a record 777 incidents between September 2020 and August 2021.  The Review confirmed that around 20% of the organisations supported were linked to the health sector and vaccines.  One of the specific incidents discussed in the Review was an attack on the University of Oxford while it was working on vaccine research crucial to the rollout of the national immunisation programme.

The health sector had previously been engaged by the NCSC following the WannaCry ransomware attack in 2017.  Since that outbreak, tens of thousands of indicators of compromise have been shared to enable mitigating action by IT leads across the health sector.  One example of the NCSC's methods is the expansion of the protective domain name system, which acts as a hard prevention by refusing to resolve domains known to be malicious.  To give a sense of the scale of this action, between January 2020 and July 2021 the system blocked access to malicious domains 4.4 billion times.
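The blocking mechanism described above can be sketched in simplified form: a resolver consults a blocklist before answering a query, and simply declines to resolve anything on it.  The domain names and records below are illustrative placeholders, not NCSC data:

```python
# Minimal sketch of blocklist-based ("protective") DNS filtering.
# Domains and records here are placeholders for illustration only.
from typing import Dict, Optional

BLOCKLIST = {"malicious.example", "phishing.example"}


def resolve(domain: str, upstream: Dict[str, str]) -> Optional[str]:
    """Answer a lookup from upstream records, or refuse to resolve blocked domains."""
    # Normalise case and any trailing dot before checking the blocklist.
    if domain.lower().rstrip(".") in BLOCKLIST:
        return None  # hard prevention: the blocked site never resolves
    return upstream.get(domain)
```

Because the lookup fails outright, a user (or malware) on the protected network never obtains an IP address for the malicious domain, which is the "hard prevention" the Review refers to.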

Enforcement

ICO issues Clearview AI Inc with a provisional fine of over £17 million

Clearview AI Inc ("Clearview"), which is headquartered in New York, utilises facial recognition technology to support various law enforcement and government agencies, both in the US and UK.  Its services are based on a database of images which the ICO considered to be "likely to include the data of a substantial number of people from the UK and may have been gathered without people's knowledge from publicly available information online, including social media platforms".

A joint investigation by the ICO and the Office of the Australian Information Commissioner established that Clearview had obtained, and was continuing to obtain, facial images and to process them for profit on behalf of hundreds of clients, in violation of, amongst other things, UK data protection laws.

The ICO's initial view is that Clearview has failed to comply with UK data protection laws by: (1) failing to process the information of people in the UK fairly; (2) failing to have a process in place to stop the data being retained indefinitely; (3) failing to have a lawful reason for collecting the information; (4) failing to meet the higher data protection standards required for biometric data (i.e. special category data pursuant to the GDPR and UK GDPR); (5) failing to inform people about what happens to their data; and (6) asking for additional personal information, including photos, when individuals enquired whether they were on the database, which may have acted as a disincentive to those wishing to object to their data being processed.  Consequently, the ICO imposed a provisional fine as well as a provisional notice to cease further processing of personal data of individuals in the UK.

Clearview now has the opportunity to respond to the allegations by way of representations, which will be considered by the ICO before a final decision and Monetary Penalty Notice is made.

ICO fines the Cabinet Office £500,000 in respect of a data breach relating to New Year Honours recipients

On 27 December 2019, the Cabinet Office published a file containing the names and unredacted addresses of over 1,000 individuals on the New Year's Honours list (the "File").  The File was published on the government website, GOV.UK, and although the Cabinet Office promptly removed it after becoming aware of the breach, the File remained cached and those who had the exact web address could still locate it.  It was indicated that the data was available online for 2 hours and 21 minutes and was accessed 3,872 times.  Thereafter, the ICO received complaints and concerns over personal safety from 30 individuals.

The ICO issued the Cabinet Office with a Monetary Penalty Notice fining it £500,000 for its contraventions of Articles 5(1)(f) and 32(1) of the GDPR (i.e. for failing to have appropriate technical and organisational measures in place to prevent the breach and remedy it appropriately).  Amongst other things: the Cabinet Office's IT system was set up incorrectly, meaning that it generated a CSV file which included postal address data; the operations team decided to amend the File instead of modifying the IT system, so postal address data continued to be automatically included in the File; and the Cabinet Office had no specific or written process in place for signing off documents and content containing personal data prior to publication.

WhatsApp granted permission to judicially review €225 million fine

As previously reported, the Irish Data Protection Commission ("DPC") imposed a fine of €225 million on WhatsApp for breaches of Articles 5(1)(a), 12, 13 and 14 of the GDPR.  In addition to the administrative fine, the DPC imposed a reprimand on WhatsApp Ireland Limited requiring it to bring its processing into compliance by taking a range of specified remedial actions.  In particular, it was required to bring its processing activities into compliance within three months and to update its privacy notices for both users and non-users to include the information required under Articles 13 and 14 of the GDPR, as identified in the DPC's decision.

WhatsApp has now secured permission from a High Court judge to pursue a judicial review of the DPC's decision.  WhatsApp is arguing that the DPC exercised its powers in an unconstitutional manner and that certain provisions of the Data Protection Act 2018 are incompatible with Ireland's obligations under the European Convention on Human Rights.

Notwithstanding its challenge to the DPC's decision, WhatsApp has amended its privacy policy which, in WhatsApp's view, provides "more information on what exactly is done" with users' information and the interplay between WhatsApp and its parent company, Meta, as well as Facebook and Instagram.  However, to date WhatsApp has made "no changes to the way in which data is processed, used or shared".

Garante prohibits data transfers and fines University for failings in retaining a US-based processor and cloud storage

In light of complaints by affected students, the Italian data protection authority ("Garante") imposed a fine of €200,000 on Luigi Bocconi University for using US proctoring software, Respondus, to invigilate examinations remotely during the COVID-19 pandemic, in breach of a number of provisions of the GDPR.

Garante considered that, amongst other things, Luigi Bocconi University:

  • Had engaged in the processing of biometric data and the performance of profiling activities without a proper legal basis (the software tracked students' behaviour during the test, audio and video recordings were taken of the test and a photograph was also taken at the beginning of the test), in breach of Articles 5(1)(a), 9, 12 and 13;
  • Had processed information which exceeded what was necessary for examination purposes, and there were shortcomings in the University's Data Protection Impact Assessment, in breach of Articles 5(1)(c), 25 and 35 of the GDPR; and
  • Had violated Article 28 and the conditions under Chapter V of the GDPR, by transferring data to a third country, the US, without adequate assurances that the transfers had been carried out in compliance with the conditions imposed by the GDPR.

Garante fines Sky Italia over €3 million for allegedly misusing customer data to make unwanted promotional calls

Garante also imposed a fine of nearly €3.3 million on Sky Italia ("Sky") for numerous breaches of the GDPR.

Dozens of customers complained about the receipt of unwanted and non-consensual phone calls promoting Sky's services both directly and through third-party call centres.  It was alleged that Sky had also utilised unverified lists obtained from other companies.  In addition to the fine, Sky Italia was ordered to cease further processing for promotional and commercial purposes which were carried out through third-party lists obtained in the absence of consent. 

Norwegian Data Protection Authority proposes a fine of €409,768 against a municipality for breaches of Articles 5(1)(f), 24 and 32 GDPR after a serious ransomware attack

In early 2021, a Norwegian municipality, Østre Toten commune, fell victim to a ransomware attack which resulted in employees being locked out of key IT systems, encryption of data and deletions of backups.  It was estimated that about 160GB of data was taken from the systems with about 2,000 documents later discovered to be on sale on the dark web.

Technical investigations revealed severe deficiencies in the municipality's IT systems and processes, which included: unsecured back-ups; lack of multi-factor authentication; and lack of proper log management.

Though the municipality notified the Norwegian DPA of the breach, the Norwegian DPA found that the municipality had employed insufficient means of protecting personal data and had insufficient internal controls in place, in breach of Articles 5(1)(f), 24 and 32 GDPR, and that these breaches merited a fine of €409,768.

The Luxembourg Data Protection Agency fined an unnamed company €13,200 for two violations of the GDPR that it identified in the course of an audit on a DPO's role within the company

The Luxembourg Data Protection Agency ("CNPD") launched a wide-reaching audit campaign on the role and function of Data Protection Officers ("DPO") with 11 audit objectives.  The CNPD consequently carried out 28 audits within various organisations, during which it was revealed that an unnamed company (the "Company") failed to comply with several obligations pertaining to a DPO's appointment and role.  This particularly included: (1) a violation of the obligation to appoint the DPO based on his/her professional qualities, a breach of Article 37(5) GDPR; (2) a violation of the obligation to involve the DPO in all matters related to the protection of personal data, a breach of Article 38(1) GDPR; (3) a violation of the obligation to provide the DPO with necessary resources, a breach of Article 38(2) GDPR; and (4) a violation of the obligation to ensure that the DPO monitors compliance with the GDPR, a breach of Article 39(1) GDPR.

Although the audit report identified four violations, only two resulted in a fine because measures were already being taken to remedy the breaches of Articles 38(1) and 39(1)(b) GDPR.  In relation to the breach of Article 38(1), the Company took steps to formalise the DPO's presence in the Company's structural consultative bodies and undertook a review of the internal procedures and policies to ensure that all bases were covered.  In remedying the breach of Article 39(1)(b), the Company demonstrated that there were internal policies in place regarding the processing of personal data and that those policies were also frequently reviewed.

Civil litigation

Lloyd v Google LLC [2021] UKSC 50 - Supreme Court finds claim for compensation under the Data Protection Act 1998 (the "DPA") cannot proceed on "opt-out basis"

In a landmark decision, the Supreme Court unanimously agreed that Richard Lloyd's claim against Google for breach of his (and those of some 4 million other Apple iPhone users') data protection rights under s.13 DPA should not proceed.

In a judgment which will have a profound effect on collective redress, both in the context of data protection litigation and more generally, and which will be welcomed by data controllers and by the cyber-insurance industry, the Supreme Court overturned the decision of the Court of Appeal (summarised here).  Restoring Warby J's finding at first instance, it held that damages for breach of the DPA were not actionable without proof of financial loss or distress.  While the Supreme Court disagreed with Warby J's characterisation of the claim as "officious litigation", accepting that representative actions under CPR 19.6 were available in damages claims, it held that such actions could not proceed where there was no evidence of damage to any Claimant, still less the entire class.

We have produced a more detailed note and analysis of the case here.

Since judgment was handed down, there have been further relevant developments:

  • The solicitors acting for the Claimants in the representative claim against TikTok, which is pursued under the GDPR rather than the DPA, have confirmed that their client intends to pursue the claim, meaning there will be a determination as to whether claims of a similar nature to that pursued in Lloyd can be brought under the GDPR, by contrast to the DPA; and
  • DCMS has indicated that it does not intend to reappraise its recent decision that no further legislation is needed to create additional avenues for data subjects to obtain redress, notwithstanding that the decision was reached at a time when the Court of Appeal's decision in Lloyd was still good law.

The English Court provides substantial guidance regarding its approach to de minimis claims pursuant to GDPR

Three recent first instance decisions, all stemming from minor data breaches, have provided welcome clarity on the English courts' approach to such claims.

Rolfe v Veale Wasbrough Vizards [2021] EWHC 2809 (QB)

The Claimants sought damages arising from a firm of solicitors misdirecting an email, which was deleted by the recipient immediately thereafter, as we reported on our data protection law hub here.  Master McCloud gave the Claimants' claim short shrift, dismissing it summarily and awarding the Defendant its costs on the indemnity basis, noting:

"We have a plainly exaggerated claim for time spent by the Claimants dealing with the case and a frankly inherently implausible suggestion that the minimal breach caused significant distress and worry or even made them 'feel ill'.  In my judgment no person of ordinary fortitude would reasonably suffer the distress claimed arising in these circumstances in the 21st Century, in a case where a single breach was quickly remedied.

There is no credible case that distress or damage over a de minimis threshold will be proved.  In the modern world it is not appropriate for a party to claim, (especially in the High Court) for breaches of this sort which are, frankly, trivial…the law will not supply a remedy in cases where effectively no harm has credibly been shown or be likely to be shown."

Johnson v Eastlight [2021] EWHC 3069

The Claimant was a party mentioned in a document that was inadvertently disclosed, but was deleted by the recipient with the same alacrity as occurred in Rolfe.  As in Rolfe the Defendant applied for the Claimant's claim to be summarily dismissed.

Master Thornett was clearly unconvinced as to the merits of the Claimant's claim (noting that their information was "not of an obviously sensitive nature in itself" and that the Claimant had in fact publicised the relevant information by issuing the proceedings without seeking to seal the Court file).  However, since he considered that it marginally passed the relevant test, the claim was remitted to the County Court.  He emphasised that it should never have been pursued in the High Court, noting:

"No serious privately paying litigant would contemplate spending over £50,000 in costs, not all of which may prove recoverable even in the event of success, and similarly expose themselves to the risk of a significant adverse costs order following High Court litigation if unsuccessful, for a damages claim less than £3,000.  The presentation and processing of this case to-date in this forum has, I am satisfied, constituted a form of procedural abuse".

Ashley v Amplifon Ltd [2021] EWHC 2921 (QB)

The Claimant sought damages arising from the inadvertent disclosure of their employment contract to another employee.  Whilst, unlike Rolfe, the Claimant's claim survived strike-out and was remitted to the County Court for determination, it did so only marginally, with Kerr J noting that the claim may not be worth the candle:

"Access to justice includes the right to litigate modest claims for amounts that may seem trivial to lawyers but are not to the party seeking not just the money but to vindicate their rights.  Whether the claim is worth the candle must be seen in that light."

These decisions, like the Supreme Court's decision in Lloyd, reflect the fact that the English Court is not tolerant of small claims, with limited evidence of loss, being pursued before the High Court (which had become somewhat of a trend following the Court of Appeal's decision in Lloyd), to the extent that they are to be pursued at all.

Given these decisions, and Saini J's recent decision in Warren (as to which see here), the torrent of these claims, which had been deluging the High Court, is likely now to slow to more of a manageable trickle.

German court asks CJEU to clarify whether calculating consumer credit scores falls within the scope of automated decision-making under GDPR

The Administrative Court of Wiesbaden has referred two questions to the CJEU regarding the scope of protections afforded under Article 22(1) GDPR pertaining to automatic decision-making and profiling in the context of calculating an individual's credit score.

This referral arises from an unnamed individual's claim against a private credit report agency, SCHUFA Holding AG ("SCHUFA"), following a bank's refusal to provide the individual with a loan on the basis of a low score provided by SCHUFA.  The individual asked SCHUFA to provide her with information concerning her circumstances, as well as the deletion of certain entries.  SCHUFA provided the individual with information on her score and on the functioning of its score calculation, but did not disclose the factors taken into account when the score was calculated or the weighting of each factor.  The individual then complained, unsuccessfully, to the Hessian Data Protection Authority ("Hessian DPA"), which reasoned that SCHUFA did not need to disclose how the scores were calculated and was therefore in compliance with the German Federal Data Protection Act.  The individual thereafter commenced court proceedings.
