On November 10, the UK Supreme Court issued a judgment in an appeal addressing whether a claimant can bring data privacy claims in a representative capacity against a global technology company in a class action suit. The claimant sought compensation on behalf of a class under section 13 of the Data Protection Act 1998 (DPA 1998) for damages suffered when the tech company allegedly tracked millions of iPhone users’ internet activity in England and Wales over a period of several months between 2011 and 2012, and used the collected data without users’ knowledge or consent for commercial purposes. The DPA 1998 was replaced by the UK General Data Protection Regulation and the Data Protection Act 2018 but was in force at the time of the alleged breaches and is applicable to this claim, the Court explained in a press summary. The Court also noted that, except in antitrust cases, UK legislation does not allow class actions and Parliament has not yet legislated to establish a class action regime related to data protection claims. The Court noted that the claimant sought to use “same interest” precedent, which allows a claim to be brought “by or against one or more persons who have the same interest as representatives of any other persons who have that interest.”
The Court reasoned that the case was “doomed to fail” because “the claimant seeks damages under section 13 of the DPA 1998 for each individual member of the represented class without attempting to show that any wrongful use was made by [the tech company] of personal data relating to that individual or that the individual suffered any material damage or distress as a result of a breach of the requirements of the Act by [the tech company].” The Court added that users’ “loss of control” over personal data did not constitute “damage” under section 13 of the DPA 1998 because the users were not shown to have lost money or suffered distress. If the case had been allowed to proceed, the tech company could have faced a £3 billion damages award.
On September 2, the Irish Data Protection Commission (Commission) announced that a final decision was reached in a General Data Protection Regulation (GDPR) investigation into a U.S.-based messaging service’s handling of individuals’ personal information. The final Article 65 decision, published by the European Data Protection Board (EDPB), imposes a €225 million fine on the company and resolves an investigation into whether the company met its transparency obligations with respect to its data processing activities. The Commission alleged that the company violated provisions of the GDPR through the way it processed users’ and non-users’ data, as well as in the way it processed and shared data with other companies owned by the parent global social media company.
According to the final decision, “a number of concerned supervisory authorities” raised objections to aspects of the draft decision, taking issue, among other things, with the size of the proposed fine, which was originally set between €30 and €50 million. Because the Commission was unable to reach a consensus with the objecting concerned supervisory authorities, a dispute resolution process was triggered. The EDPB ultimately ordered the Commission to reassess and increase its proposed fine. In addition to imposing the administrative fine, the Commission also ordered the company “to bring its processing into compliance by taking a range of specified remedial actions.”
Recently, a global technology corporation disclosed a €746 million (approximately $888 million) fine issued by the Luxembourg National Commission for Data Protection (CNPD) for alleged violations of the EU’s General Data Protection Regulation (GDPR). The corporation’s Form 10-Q for second quarter 2021 states that on July 16, the CNPD issued a decision against the corporation’s European headquarters, claiming its “processing of personal data did not comply with the [GDPR].” In addition to the fine, the decision also requires corresponding practice revisions, the details of which were not disclosed. The corporation stated that the decision is “without merit” and that it intends to defend itself “vigorously” in this matter. According to sources, the decision follows an investigation started in 2018, when a French privacy group claiming to represent the interests of Europeans filed complaints against several large technology companies to ensure European consumer data is not manipulated for commercial or political purposes.
On December 15, the Irish Data Protection Commission (Commission) announced a final decision was reached in a General Data Protection Regulation (GDPR) investigation into a U.S.-based social networking tech company’s actions related to a 2019 data breach that affected users across the European Union. The final decision, published by the European Data Protection Board (EDPB), imposes a €450,000 fine against the company, and resolves an investigation in which the Commission alleged the company violated Articles 33(1) and 33(5) of the GDPR by failing to provide notice about the breach within a 72-hour period and by neglecting to adequately document the breach. According to the Commission, this inquiry is the first “dispute resolution” Article 65 decision (draft decision) under the GDPR, and marks the first decision issued against a “big tech” company. According to the final decision, “a number of concerned supervisory authorities raised objections” to aspects of the draft decision, taking issue, among other things, with the size of the proposed fine, which was originally set between €135,000 and €275,000. The EDPB determined that the objections were “relevant and reasoned” and instructed the Commission to increase the fine to ensure “it fulfils its purpose as a corrective measure and meets the requirements of effectiveness, dissuasiveness and proportionality” established under the GDPR.
On September 16, the U.S. Senate Committee on Commerce, Science, and Transportation announced it will convene a hearing on September 23 to “examine the current state of consumer data privacy and legislative efforts to provide baseline data protections for all Americans.” The hearing will also examine the lessons learned from the EU’s General Data Protection Regulation and recently enacted state privacy laws, along with the data privacy impacts from COVID-19.
The current slate of key witnesses includes a number of former chairmen and commissioners of the FTC.
On August 5, the FTC Commissioners testified before the Senate Committee on Commerce, Science, and Transportation and discussed, among other things, the agency’s continued enforcement of the EU-U.S. Privacy Shield, despite the recent Court of Justice of the European Union (CJEU) invalidation of the framework, and their interest in federal data privacy legislation. As previously covered by InfoBytes, in July, the CJEU determined that because the requirements of U.S. national security, public interest and law enforcement have “primacy” over the data protection principles of the EU-U.S. Privacy Shield, the data transferred under the EU-U.S. Privacy Shield would not be subject to the same level of protections prescribed by the EU General Data Protection Regulation, and thus, declared the EU-U.S. Privacy Shield invalid.
In his opening remarks, Chairman Simons emphasized that the FTC will “continue to hold companies accountable for their privacy commitments, including privacy promises made under the Privacy Shield,” which the FTC has also noted on its website. Additionally, Simons urged Congress to enact federal privacy and data security legislation that would be enforced by the FTC and give the agency, among other things, the “ability to seek civil penalties” and “targeted [Administrative Procedures Act] rulemaking authority to ensure that the law keeps pace with changes and technology in the market.” Moreover, Commissioner Wilson agreed with a senator’s proposition that the enactment of a preemptive federal privacy framework would make “achieving a future adequacy determination by the E.U. easier.”
Court of Justice of the European Union invalidates EU-U.S. Privacy Shield; standard contractual clauses survive (for now)
On July 16, 2020, the Court of Justice of the European Union (CJEU) issued its opinion in the Schrems II case (Case C-311/18). In its opinion, the CJEU concluded that the Standard Contractual Clauses issued by the European Commission for the transfer of personal data to data processors established outside of the EU are valid. However, the Court invalidated the EU-U.S. Privacy Shield. The ruling cannot be appealed.
In 2015, a privacy campaigner named Max Schrems filed a complaint with Ireland’s Data Protection Commissioner challenging a global social media company’s use of data transfers from servers in Ireland to servers in the U.S. Schrems argued that U.S. laws did not offer sufficient protection of EU customer data, that EU customer data might be at risk of being accessed and processed by the U.S. government once transferred, and that there was no remedy available to EU individuals to ensure protection of their personal data after transfer to the U.S. Schrems sought the suspension or prohibition of future data transfers, which were executed by the company through standard data protection contractual clauses (a method approved by the European Commission in 2010 through Decision 2010/87). The social media company had utilized these standard contractual clauses after the CJEU invalidated the U.S.-EU Safe Harbor Framework in 2015.
Following the complaint, Ireland’s Data Protection Commissioner brought proceedings against the social media company in the Irish High Court, which referred numerous questions to the CJEU for a preliminary ruling, including questions addressing the validity of the standard contractual clauses and the EU-U.S. Privacy Shield.
CJEU Opinion – Standard Contractual Clauses (Decision 2010/87)
Upon review of the recommendations published by the CJEU’s Advocate General on December 19, 2019, the CJEU found Decision 2010/87, which approves the use of standard contractual clauses to transfer personal data, to be valid.
The CJEU noted that the GDPR applies to the transfer of personal data for commercial purposes by a company operating in an EU member state to another company outside of the EU, notwithstanding the third-party country’s processing of the data under its own security laws. Moreover, the CJEU explained that data protection contractual clauses between an EU company and a company operating in a third-party country must afford a level of protection “essentially equivalent to that which is guaranteed within the European Union” under the GDPR. According to the CJEU, the level of protection must take into consideration not only the contractual clauses executed by the companies, but the “relevant aspects of the legal system of that third country.”
As for Decision 2010/87, the CJEU determined that it provides effective mechanisms to, in practice, ensure contractual clauses governing data transfers comply with the level of protection required by the GDPR, and appropriately requires the suspension or prohibition of transfers in the event the clauses are breached or unable to be honored. The CJEU specifically highlighted the certification required of the EU data exporter and the third-party country recipient to verify, prior to any transfer, (i) the level of data protection in the third-party country; and (ii) their ability to comply with the data protection clauses.
CJEU Opinion – EU-U.S. Privacy Shield (Decision 2016/1250)
The CJEU decided to examine and rule on the validity of the EU-U.S. Privacy Shield. The CJEU determined that because the requirements of U.S. national security, public interest and law enforcement have “primacy” over the data protection principles of the EU-U.S. Privacy Shield, the data transferred under the EU-U.S. Privacy Shield would not be subject to the same level of protections prescribed by the GDPR. Specifically, the CJEU held that the surveillance programs used by U.S. authorities are not proportionally equivalent to those allowed under EU law because they are not “limited to what is strictly necessary,” nor, under certain surveillance programs, does the U.S. “grant data subjects actionable rights before the courts against the U.S. authorities.” Moreover, the CJEU rejected the argument that the Ombudsperson mechanism satisfies the GDPR’s right to judicial protection, stating that it “does not provide any cause of action before a body which offers the persons whose data is transferred to the United States guarantees essentially equivalent to those required by [the GDPR],” and the Ombudsperson “cannot be regarded as a tribunal.” Thus, on those grounds, the CJEU declared the EU-U.S. Privacy Shield invalid.
On July 19, the United Kingdom’s Information Commissioner’s Office (ICO) issued an £80,000 fine against a London-based real estate management company for allegedly leaving over 18,000 customers’ personal data exposed for almost two years. According to the ICO, when the company transferred personal data from its server to a partner organization, the company failed to switch off an “anonymous authentication” function, which exposed all the data—including personal data such as bank statements, salary details, copies of passports, dates of birth, and addresses—stored between March 2015 and February 2017. The ICO alleges that the company failed to take appropriate technical and organizational measures to protect customers’ personal data and concluded the failures were “a serious contravention of the 1998 data protection laws which have since been replaced by the [General Data Protection Regulation] GDPR and the Data Protection Act 2018.”
On July 8 and 9, the United Kingdom’s Information Commissioner’s Office (ICO) issued two notices of its intention to fine companies for infringements of the General Data Protection Regulation (GDPR). On July 8, the ICO announced it intended to fine a U.K.-based airline £183.39M for a September 2018 cyber incident, which, due to “poor security arrangements,” allowed attackers to divert user traffic on the airline’s website to a fraudulent site, making consumer details accessible. The airline notified the ICO about the incident, which compromised the data of approximately 500,000 consumers, and has cooperated with the ICO in the investigation and made improvements to its security arrangements. Additionally, on July 9, the ICO announced it intended to fine a multinational hotel chain £99,200,396 for failing to undertake sufficient due diligence when the chain purchased a hotel group in 2016, which had previously exposed 339 million guest records globally in 2014. The exposure was discovered in 2018, and the hotel chain thereafter reported the incident to the ICO, and has cooperated with the investigation and made improvements to its security arrangements. In both announcements, the ICO notes that it will “consider carefully the representations made by the company and the other concerned data protection authorities” before issuing the final decision.
On June 27, the FTC held its fourth annual PrivacyCon, which hosted research presentations on a wide range of consumer privacy and security issues. Following opening remarks by FTC Chairman Joseph Simons, the one-day conference featured four plenary sessions covering a number of hot topics:
- Session 1: Privacy Policies, Disclosures, and Permissions. Five presenters discussed various aspects of privacy policies and notices to consumers. The panel discussed current trends showing that privacy notices to consumers have generally become lengthier in recent years, which helps cover the information regulators require, but often results in information overload for consumers more generally. One presenter advocated the concept of a condensed “nutrition label” for privacy, but acknowledged the challenge of distilling complicated activities into short bullets.
- Session 2: Consumer Preferences, Expectations, and Behaviors. This panel addressed research concerning consumer expectations and behaviors with regard to privacy. Among other anecdotal information, the presenters noted that many consumers are aware that personal data is tracked, but consumers are generally unaware of what data collectors ultimately do with the personal data once collected. To that end, one presenter advocated prescriptive limits on data collection in general, which would take the onus off consumers to protect themselves. Separately, with regard to the Children’s Online Privacy Protection Act (COPPA), one presenter noted that the law generally aligns with parents’ privacy expectations, but the implementing regulations and guidelines are too broad and leave too much room for implementation variations.
- Session 3: Tracking and Online Advertising. In the third session, five presenters covered topics ranging from the privacy implications of free versus paid-for applications to the impact of the EU’s General Data Protection Regulation (GDPR). According to the presenters, current research suggests that the measurable privacy benefits of paying for an app are “tenuous at best,” and consumers cannot be expected to make informed decisions because the necessary privacy information is not always available in the purchase program on a mobile device such as a phone. As for GDPR, the panel agreed that there are notable reductions in web use, with page views falling 9.7 percent in one study, although it is not clear whether such reduction is directly correlated to the May 25, 2018 effective date for enforcement of GDPR.
- Session 4: Vulnerabilities, Leaks, and Breach Notifications. In the final presentation, presenters discussed new research on how companies can mitigate data security vulnerabilities and improve remediation. One presenter discussed the need for proactive identification of vulnerabilities, noting that the goal should be to patch the real vulnerabilities and limit efforts related to vulnerabilities that are unlikely to be exploited. Another presenter analyzed data breach notifications to consumers, noting that all 50 states have data breach notification laws, but there is no consensus as to best practices related to the content or timing of notifications to consumers. The presenter concluded with recommendations for future notification regulations: (i) incorporate readability testing based on standardized methods; (ii) provide concrete guidelines on when customers need to be notified, what content needs to be included, and how the information should be presented; (iii) include visuals to highlight key information; and (iv) leverage the influence of templates, such as the model privacy form for the Gramm-Leach-Bliley Act.