InfoBytes Blog

Financial Services Law Insights and Observations

  • FCC provides safe harbors for blocking illegal robocalls

    Privacy, Cyber Risk & Data Security

    On July 16, the FCC issued an order adopting rules to further encourage phone companies to block illegal and unwanted robocalls and to continue the Commission’s implementation of the TRACED Act (covered by InfoBytes here). The rule establishes two safe harbors from liability for the unintended or inadvertent blocking of wanted calls: (i) terminating voice service providers will not be held liable under the Communications Act and FCC rules for blocking calls, provided “reasonable analytics,” such as caller ID authentication information, are used to identify and block illegal or unwanted calls; and (ii) voice service providers will not be held liable for blocking calls from “bad-actor upstream voice service providers that continue to allow unwanted calls to traverse their networks.” The FCC’s order also includes a Further Notice of Proposed Rulemaking seeking comments on, among other things, “whether to obligate originating and intermediate providers to better police their networks against illegal calls,” whether the “reasonable analytics” safe harbor should be expanded “to include network-based blocking without consumer opt-out,” and whether the Commission should adopt more extensive redress requirements and require terminating providers to give consumers information about blocked calls.

    Privacy/Cyber Risk & Data Security FCC Robocalls TRACED Act

  • Court of Justice of the European Union invalidates EU-U.S. Privacy Shield; standard contractual clauses survive (for now)

    Privacy, Cyber Risk & Data Security

    On July 16, 2020, the Court of Justice of the European Union (CJEU) issued its opinion in the Schrems II case (Case C-311/18). In its opinion, the CJEU concluded that the Standard Contractual Clauses issued by the European Commission for the transfer of personal data to data processors established outside of the EU are valid. However, the Court invalidated the EU-U.S. Privacy Shield. The ruling cannot be appealed.

    Background

    In 2015, a privacy campaigner named Max Schrems filed a complaint with Ireland’s Data Protection Commissioner challenging a global social media company’s transfers of personal data from servers in Ireland to servers in the U.S. Schrems argued that U.S. laws did not offer sufficient protection of EU customer data, that EU customer data might be at risk of being accessed and processed by the U.S. government once transferred, and that there was no remedy available to EU individuals to ensure protection of their personal data after transfer to the U.S. Schrems sought the suspension or prohibition of future data transfers, which the company executed through standard data protection contractual clauses (a method approved by the European Commission in 2010 through Decision 2010/87). The social media company had utilized these standard contractual clauses after the CJEU invalidated the U.S.-EU Safe Harbor Framework in 2015.

    Following the complaint, Ireland’s Data Protection Commissioner brought proceedings against the social media company in the Irish High Court, which referred numerous questions to the CJEU for a preliminary ruling, including questions addressing the validity of the standard contractual clauses and the EU-U.S. Privacy Shield.

    CJEU Opinion – Standard Contractual Clauses (Decision 2010/87)

    After reviewing the recommendations published by the CJEU’s Advocate General on December 19, 2019, the CJEU upheld the validity of the Decision approving the use of standard contractual clauses to transfer personal data.

    The CJEU noted that the GDPR applies to the transfer of personal data for commercial purposes by a company operating in an EU member state to another company outside of the EU, notwithstanding the third-party country’s processing of the data under its own security laws. Moreover, the CJEU explained that data protection contractual clauses between an EU company and a company operating in a third-party country must afford a level of protection “essentially equivalent to that which is guaranteed within the European Union” under the GDPR. According to the CJEU, the level of protection must take into consideration not only the contractual clauses executed by the companies, but the “relevant aspects of the legal system of that third country.”

    As for Decision 2010/87, the CJEU determined that it provides effective mechanisms to ensure, in practice, that contractual clauses governing data transfers comply with the level of protection required by the GDPR, and that it appropriately requires the suspension or prohibition of transfers in the event the clauses are breached or cannot be honored. The CJEU specifically highlighted the obligation of the EU data exporter and the third-party country recipient to verify, prior to any transfer, (i) the level of data protection in the third-party country; and (ii) their ability to comply with the data protection clauses.

    CJEU Opinion – EU-U.S. Privacy Shield (Decision 2016/1250)

    The CJEU decided to examine and rule on the validity of the EU-U.S. Privacy Shield. The CJEU determined that because the requirements of U.S. national security, public interest, and law enforcement have “primacy” over the data protection principles of the EU-U.S. Privacy Shield, data transferred under the EU-U.S. Privacy Shield would not be subject to the same level of protection prescribed by the GDPR. Specifically, the CJEU held that the surveillance programs used by U.S. authorities do not satisfy the proportionality requirements of EU law because they are not “limited to what is strictly necessary,” nor, under certain surveillance programs, does the U.S. “grant data subjects actionable rights before the courts against the U.S. authorities.” Moreover, the CJEU rejected the argument that the Ombudsperson mechanism satisfies the GDPR’s right to judicial protection, stating that it “does not provide any cause of action before a body which offers the persons whose data is transferred to the United States guarantees essentially equivalent to those required by [the GDPR],” and that the Ombudsperson “cannot be regarded as a tribunal.” On those grounds, the CJEU declared the EU-U.S. Privacy Shield invalid.

    Privacy/Cyber Risk & Data Security GDPR European Union Of Interest to Non-US Persons EU-US Privacy Shield

  • California AG publishes CCPA FAQs

    Privacy, Cyber Risk & Data Security

    The California attorney general recently published a set of frequently asked questions providing general consumer information on the California Consumer Privacy Act (CCPA). The CCPA—enacted in June 2018 (covered by a Buckley Special Alert) and amended several times—became effective January 1. Final proposed regulations were submitted by the AG last month as required under the CCPA’s July 1 statutory deadline (covered by InfoBytes here), and are currently with the California Office of Administrative Law for review. The FAQs—which will be updated periodically and do not serve as legal advice, regulatory guidance, or as an opinion of the AG—are intended to provide consumers guidance on exercising their rights under the CCPA.

    • General CCPA information. The FAQs address consumer rights under the CCPA and reiterate that these rights apply only to California residents. This section also clarifies the definition of “personal information,” outlines businesses’ compliance thresholds, and states that the CCPA does not apply to nonprofit organizations and government agencies. The FAQs also remind consumers of their limited ability to sue businesses for CCPA violations and detail the conditions that must be met before a consumer may sue a business for a data breach. Consumers who believe a business has violated the CCPA may file a complaint with the AG’s office.
    • Right to opt-out of sale. The FAQs answer common questions related to consumers’ requests that businesses not sell their personal information. They provide information on the steps for submitting opt-out requests, explain why a business may deny an opt-out request, and address circumstances where a consumer receives a response from a service provider saying it is not required to act on an opt-out request.
    • Right to know. The FAQs discuss a consumer’s right to know what personal information is collected, used, shared, or sold, and clarify what consumers should do to submit requests to know, how long a business may take to respond, and what steps should be taken if a business requests more information, denies a request to know, or claims to be a service provider that is not required to respond.
    • Required notices. The FAQs outline the disclosures that businesses must provide, i.e., the “notice at collection” and the privacy policy, and discuss the common places where notices at collection and privacy policies are located.
    • Request to delete. The FAQs address several questions related to consumers’ right to delete personal information, including how to submit a request to delete, businesses’ responses to and denials of requests to delete, and why a debt collector may still attempt to collect a debt or a credit reporting agency may still provide credit information even after a request to delete has been made.
    • Right to non-discrimination. Consumers are reminded that a business “cannot deny goods or services, charge . . . a different price, or provide a different level or quality of goods or services just because [a consumer] exercised [his or her] rights under the CCPA.”
    • Data brokers. The FAQs set forth the definition of a data broker under California law and outline steps for consumers interested in finding data brokers that collect and sell personal information, as well as measures consumers can take to opt-out of the sale of certain personal information.

    Privacy/Cyber Risk & Data Security State Issues CCPA California State Attorney General Opt-Out Disclosures

  • FTC settles with app developer for COPPA violations

    Privacy, Cyber Risk & Data Security

    On June 4, the FTC announced that a children’s mobile application developer agreed to pay $150,000 and to delete the personal information it allegedly unlawfully collected from children under the age of 13 to resolve allegations that the developer violated the Children’s Online Privacy Protection Act (COPPA) Rule. According to the complaint filed in the U.S. District Court for the Northern District of California, the developer, without notifying parents or obtaining verifiable parental consent, allowed third-party advertising networks to use persistent identifiers to track users of its child-directed apps in order to send targeted advertisements to the children. The proposed settlement requires the developer to destroy any personal data collected from children under 13 and to notify parents and obtain verifiable parental consent for any child-directed app or website it offers that collects personal information from children under 13. A $4 million penalty is suspended upon the payment of $150,000 due to the developer’s inability to pay.

    In dissent, Commissioner Phillips argued that the fine imposed against the developer was too high, noting that having children view advertisements based on the collection of persistent identifiers “is something; but it is not everything,” under COPPA. Commissioner Phillips argued that because the developer did not “share[] sensitive personal information about children, or publicize[] it” nor did the developer expose children “to unauthorized contact from strangers, or otherwise put [the children] in danger,” the assessed penalty was too large in comparison to the harm.

    In response to the dissent, Chairman Simons argued that while “harm is an important factor to consider…[the FTC’s] first priority is to use [] penalties to deter [] practices. Even in the absence of demonstrable money harm, Congress has said that these law violations merit the imposition of civil penalties.”

    Privacy/Cyber Risk & Data Security FTC Enforcement COPPA Courts

  • $550 million preliminary settlement reached in biometric privacy class action

    Privacy, Cyber Risk & Data Security

    On May 8, plaintiffs in a biometric privacy class action in the U.S. District Court for the Northern District of California filed a motion requesting preliminary approval of a $550 million settlement deal. The preliminary settlement, reached between a global social media company and a class of Illinois users, would resolve consolidated class claims that alleged the social media company’s face scanning practices violated the Illinois Biometric Information Privacy Act (BIPA). As previously covered by InfoBytes, last August the U.S. Court of Appeals for the 9th Circuit affirmed class certification and held that the class’s claims met the standing requirement described in Spokeo, Inc. v. Robins because the social media company’s alleged development of a face template that used facial-recognition technology without users’ consent constituted an invasion of an individual’s private affairs and concrete interests. According to the motion for preliminary approval, the settlement would be the largest BIPA class action settlement ever and would provide “cash relief that far outstrips what class members typically receive in privacy settlements, even in cases in which substantial statutory damages are involved.” If approved, the social media company must also provide “forward-looking relief” to ensure it secures users’ informed, written consent as required under BIPA.

    Privacy/Cyber Risk & Data Security Courts Enforcement Consumer Protection Settlement Class Action State Issues

  • Court approves $5 billion FTC settlement with social media company

    Privacy, Cyber Risk & Data Security

    On April 23, the U.S. District Court for the District of Columbia approved a $5 billion settlement between the FTC and a global social media company, resolving allegations that the company violated consumer protection laws by using deceptive disclosures and settings to undermine users’ privacy preferences in violation of a 2012 privacy settlement with the FTC. The settlement, first announced last July (covered by InfoBytes here), requires the company to take a series of remedial steps, including (i) ceasing misrepresentations concerning its collection and disclosure of users’ personal information, as well as its privacy and security measures; (ii) clearly disclosing when it will share data with third parties and obtaining users’ express consent if the sharing goes beyond a user’s privacy setting restrictions; (iii) deleting or de-identifying a user’s personal information within a reasonable time frame if an account is closed; (iv) creating a more robust privacy program with safeguards applicable to third parties with access to users’ personal information; (v) creating a new privacy committee and designating a dedicated corporate officer in charge of monitoring the effectiveness of the privacy program; (vi) alerting the FTC when the personal information of more than 500 users has been compromised; and (vii) undertaking reporting and recordkeeping obligations and commissioning regular, independent privacy assessments. The order “resolves all consumer-protection claims known by the FTC prior to June 12, 2019, that [the company], its officers, and directors violated Section 5 of the FTC Act.” While the court acknowledged concerns raised by several amici opposing the settlement, it concluded that the settlement and the proposed remedies were reasonable and in the public interest. On April 28, the FTC announced the formal approval of amendments to its 2012 privacy order to incorporate updated provisions included in the 2019 settlement.

    Privacy/Cyber Risk & Data Security FTC Enforcement Consumer Protection Settlement

  • Multi-jurisdiction settlement reached with credit reporting agency over 2017 data breach

    Privacy, Cyber Risk & Data Security

    On April 17, the Massachusetts attorney general announced a settlement with a credit reporting agency (CRA) to resolve a state investigation into a 2017 data breach that reportedly compromised the personal information of nearly three million Massachusetts residents. According to the AG’s 2017 complaint (covered by InfoBytes here), the CRA ignored cybersecurity vulnerabilities for months before the breach occurred and failed to take measures to implement and maintain reasonable safeguards. Under the terms of the proposed settlement, pending final court approval, the CRA will pay Massachusetts $18.2 million and is required to take significant measures to strengthen its security practices to ensure compliance with Massachusetts law. These measures include (i) implementing a comprehensive information security program; (ii) minimizing the collection of sensitive personal information; (iii) managing and implementing specific technical safeguards and controls; (iv) providing consumer-related relief, such as credit monitoring services and security freezes; and (v) allowing third-party assessments of its data safeguards.

    Earlier, on April 14, the Indiana attorney general also announced that the CRA will pay the state $19.5 million to resolve allegations that it failed to protect Indiana residents whose personal information was exposed in the 2017 data breach. Under the terms of the final judgment and consent decree, in addition to paying $19.5 million in restitution, the CRA must take measures similar to those outlined in the Massachusetts settlement.

    Massachusetts and Indiana were the only two states that chose not to participate in the 2019 multi-agency settlement that resolved federal and state investigations into the 2017 data breach and required the company to pay up to $700 million (covered by InfoBytes here).

    Separately, on April 7, the City of Chicago announced a $1.5 million settlement to resolve allegations that the CRA’s failure to employ adequate data-security measures led to the breach.

    Privacy/Cyber Risk & Data Security State Attorney General Data Breach State Issues Credit Reporting Agency Settlement Massachusetts Indiana

  • FCC orders phone companies to deploy STIR/SHAKEN framework

    Privacy, Cyber Risk & Data Security

    On March 31, the FCC adopted new rules that will require phone companies in the U.S. to deploy the STIR/SHAKEN caller ID authentication framework by June 30, 2021. As previously covered by InfoBytes, the STIR/SHAKEN framework addresses “unlawful spoofing by confirming that a call actually comes from the number indicated in the Caller ID, or at least that the call entered the US network through a particular voice service provider or gateway.” FCC Chairman Ajit Pai endorsed the value of widespread implementation, stating the framework will “reduce the effectiveness of illegal spoofing, allow law enforcement to identify bad actors more easily, and help phone companies identify—and even block—calls with illegal spoofed caller ID information before those calls reach their subscribers.” The new rules also contain a further notice of proposed rulemaking, which seeks comments on additional efforts to promote caller ID authentication and implement certain sections of the TRACED Act. Among other things, the TRACED Act—signed into law last December (covered by InfoBytes here)—mandated compliance with STIR/SHAKEN for all voice service providers.
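
    At a technical level, STIR/SHAKEN works by having the originating provider attach a signed token (a “PASSporT,” defined in RFC 8225, with the SHAKEN extension in RFC 8588) to the call, which the terminating provider then verifies. The sketch below is a simplified, illustrative Python example of inspecting such a token’s claims; the claim names follow the published RFCs, but the function, its parameters, and the shortcuts it takes (notably, skipping signature and certificate-chain validation) are hypothetical and are not drawn from any provider’s actual implementation.

```python
# Minimal, illustrative sketch of inspecting a SHAKEN PASSporT (RFC 8225 / RFC 8588).
# NOTE: a real verification service must also validate the token's signature against
# the originating provider's certificate chain; that step is omitted here.
import base64
import json
import time


def _b64url_decode(segment: str) -> bytes:
    """Decode a base64url-encoded JWT segment, restoring any stripped padding."""
    return base64.urlsafe_b64decode(segment + "=" * (-len(segment) % 4))


def inspect_passport(token: str, displayed_caller_id: str, max_age_seconds: int = 60) -> dict:
    """Parse a PASSporT and compare its claims to the caller ID shown to the subscriber."""
    header_b64, payload_b64, _signature = token.split(".")
    header = json.loads(_b64url_decode(header_b64))
    claims = json.loads(_b64url_decode(payload_b64))

    return {
        # "shaken" is the PASSporT extension type used by SHAKEN (RFC 8588).
        "is_shaken": header.get("ppt") == "shaken",
        # Attestation level: "A" (full), "B" (partial), or "C" (gateway).
        "attestation": claims.get("attest"),
        # Does the attested originating telephone number match the displayed caller ID?
        "caller_id_matches": claims.get("orig", {}).get("tn") == displayed_caller_id,
        # PASSporTs are short-lived; a stale "iat" timestamp suggests a replayed token.
        "is_fresh": (time.time() - claims.get("iat", 0)) <= max_age_seconds,
    }
```

    In practice, an “A”-level attestation with a matching originating number is the strongest signal that the displayed caller ID has not been spoofed; weaker or mismatched results are the kind of inputs that can feed the “reasonable analytics” providers may rely on when deciding whether to block a call.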

    Privacy/Cyber Risk & Data Security FCC Robocalls Agency Rule-Making & Guidance

  • FTC report highlights 2019 privacy and data security work

    Privacy, Cyber Risk & Data Security

    On February 25, the FTC released its annual report highlighting the agency’s privacy and data security work in 2019. Among other items, the report summarizes consumer-related enforcement activities in 2019, including:

    • A $5 billion penalty—the largest consumer privacy penalty to date—against a global social media company to resolve allegations that the company violated its 2012 FTC privacy order and mishandled users’ personal information. (Covered by InfoBytes here.)
    • A $170 million penalty against a global online search engine and its video-sharing subsidiary to resolve alleged violations of the Children’s Online Privacy Protection Act (COPPA). (Covered by InfoBytes here.) 
    • A proposed settlement in the FTC’s first case against developers of “stalking” apps that monitor consumers’ mobile devices and allegedly compromise consumer privacy in violation of the FTC Act’s prohibition against unfair and deceptive practices and COPPA.
    • A global settlement of up to $700 million issued in conjunction with the CFPB, 48 states, the District of Columbia and Puerto Rico, to resolve federal and state investigations into a 2017 data breach that reportedly compromised sensitive information for approximately 147 million consumers. (Covered by InfoBytes here.)

    The report also discusses the FTC’s enforcement of the EU-U.S. Privacy Shield framework, provides links to FTC congressional testimony on privacy and data security, and offers a list of relevant rulemaking, including rules currently under review. In addition, the report highlights recent privacy-related events, including (i) an FTC hearing examining consumer privacy as part of its Hearings on Competition and Consumer Protection in the 21st Century; (ii) the fourth annual PrivacyCon event, which hosted research presentations on consumer privacy and security issues (covered by InfoBytes here); (iii) a workshop examining possible updates to COPPA; and (iv) a public workshop that examined issues affecting consumer reporting accuracy.

    Privacy/Cyber Risk & Data Security FTC Enforcement Consumer Protection COPPA FTC Act UDAP Consumer Reporting

  • CFTC adopts NIST Privacy Framework

    Privacy, Cyber Risk & Data Security

    On January 28, the CFTC announced that it has adopted the National Institute of Standards and Technology (NIST) Privacy Framework, making it the first federal agency to do so. NIST’s September release of a preliminary draft of the framework (covered by InfoBytes here) described it as “[a] Tool for Improving Privacy through Enterprise Risk Management.” Among other things, the privacy framework, which complements NIST’s existing guidance for mitigating cybersecurity risk, describes processes to mitigate risks associated with data processing and privacy breaches and to assess current privacy risk management measures. According to the announcement, the CFTC will utilize the framework to “better manage and communicate privacy risk throughout the agency,” making the agency a leader in the data privacy protection arena.

    Privacy/Cyber Risk & Data Security NIST CFTC Risk Management
