The California attorney general recently published a set of frequently asked questions providing general consumer information on the California Consumer Privacy Act (CCPA). The CCPA—enacted in June 2018 (covered by a Buckley Special Alert) and amended several times—became effective January 1. Final proposed regulations were submitted by the AG last month as required under the CCPA’s July 1 statutory deadline (covered by InfoBytes here), and are currently with the California Office of Administrative Law for review. The FAQs—which will be updated periodically and do not serve as legal advice, regulatory guidance, or as an opinion of the AG—are intended to provide consumers guidance on exercising their rights under the CCPA.
- General CCPA information. The FAQs address consumer rights under the CCPA and reiterate that these rights apply only to California residents. This section also clarifies the definition of “personal information,” outlines businesses’ compliance thresholds, and states that the CCPA does not apply to nonprofit organizations and government agencies. The FAQs also remind consumers of their limited ability to sue businesses for CCPA violations and detail the conditions that must be met before a consumer may sue a business for a data breach. Consumers who believe a business has violated the CCPA may file a complaint with the AG’s office.
- Right to opt-out of sale. The FAQs answer common questions related to consumers’ requests that businesses not sell their personal information. The FAQs explain the steps for submitting opt-out requests, as well as the reasons a business may deny such a request. They also address circumstances in which a consumer receives a response from a service provider stating that it is not required to act on an opt-out request.
- Right to know. The FAQs discuss a consumer’s right to know what personal information is collected, used, shared, or sold, and clarify how consumers should submit requests to know, how long a business may take to respond, and what steps to take if a business requests more information, denies a request to know, or claims to be a service provider that is not required to respond.
- Request to delete. The FAQs address several questions related to consumers’ right to delete personal information, including how to submit a request to delete, businesses’ responses to and denials of requests to delete, and why a debt collector may make an attempt to collect a debt or a credit reporting agency may provide credit information even after a request to delete has been made.
- Right to non-discrimination. Consumers are reminded that a business “cannot deny goods or services, charge . . . a different price, or provide a different level or quality of goods or services just because [a consumer] exercised [his or her] rights under the CCPA.”
- Data brokers. The FAQs set forth the definition of a data broker under California law and outline steps for consumers interested in finding data brokers that collect and sell personal information, as well as measures consumers can take to opt-out of the sale of certain personal information.
On June 4, the FTC announced that a children’s mobile application developer agreed to pay $150,000 and to delete the personal information it allegedly unlawfully collected from children under the age of 13 to resolve allegations that the developer violated the Children’s Online Privacy Protection Act Rule (COPPA Rule). According to the complaint filed in the U.S. District Court for the Northern District of California, the developer, without notifying parents or obtaining verifiable parental consent, allowed third-party advertising networks to use persistent identifiers to track users of the child-directed apps in order to send targeted advertisements to the children. The proposed settlement requires the developer to destroy any personal data collected from children under 13 and notify and obtain verifiable consent from parents for any child-directed app or website they offer that collects personal information from children under 13. A $4 million penalty is suspended upon the payment of $150,000 due to the developer’s inability to pay.
In dissent, Commissioner Phillips argued that the fine imposed against the developer was too high, noting that having children view advertisements based on the collection of persistent identifiers “is something; but it is not everything,” under COPPA. Commissioner Phillips argued that because the developer did not “share sensitive personal information about children, or publicize it” nor did the developer expose children “to unauthorized contact from strangers, or otherwise put [the children] in danger,” the assessed penalty was too large in comparison to the harm.
In response to the dissent, Chairman Simons argued that while “harm is an important factor to consider…[the FTC’s] first priority is to use penalties to deter practices. Even in the absence of demonstrable money harm, Congress has said that these law violations merit the imposition of civil penalties.”
On May 8, plaintiffs in a biometric privacy class action in the U.S. District Court for the Northern District of California filed a motion requesting preliminary approval of a $550 million settlement deal. The preliminary settlement, reached between a global social media company and a class of Illinois users, would resolve consolidated class claims that alleged the social media company’s face scanning practices violated the Illinois Biometric Information Privacy Act (BIPA). As previously covered by InfoBytes, last August the U.S. Court of Appeals for the 9th Circuit affirmed class certification and held that the class’s claims met the standing requirement described in Spokeo, Inc. v. Robins because the social media company’s alleged development of a face template that used facial-recognition technology without users’ consent constituted an invasion of an individual’s private affairs and concrete interests. According to the motion for preliminary approval, the settlement would be the largest BIPA class action settlement ever and would provide “cash relief that far outstrips what class members typically receive in privacy settlements, even in cases in which substantial statutory damages are involved.” If approved, the social media company must also provide “forward-looking relief” to ensure it secures users’ informed, written consent as required under BIPA.
On April 23, the U.S. District Court for the District of Columbia approved a $5 billion settlement between the FTC and a global social media company, resolving allegations that the company violated consumer protection laws by using deceptive disclosures and settings to undermine users’ privacy preferences in violation of a 2012 privacy settlement with the FTC. The settlement, first announced last July (covered by InfoBytes here), requires the company to take a series of remedial steps, including (i) ceasing misrepresentations concerning its collection and disclosure of users’ personal information, as well as its privacy and security measures; (ii) clearly disclosing when it will share data with third parties and obtaining users’ express consent if the sharing goes beyond a user’s privacy setting restrictions; (iii) deleting or de-identifying a user’s personal information within a reasonable time frame if an account is closed; (iv) creating a more robust privacy program with safeguards applicable to third parties with access to a user’s personal information; (v) creating a new privacy committee and designating a dedicated corporate officer in charge of monitoring the effectiveness of the privacy program; (vi) alerting the FTC when more than 500 users’ personal information has been compromised; and (vii) undertaking reporting and recordkeeping obligations, and commissioning regular, independent privacy assessments. The order “resolves all consumer-protection claims known by the FTC prior to June 12, 2019, that [the company], its officers, and directors violated Section 5 of the FTC Act.” While the court acknowledged concerns raised by several amici opposing the settlement, the court concluded that the settlement and the proposed remedies were reasonable and in the public interest. On April 28, the FTC announced the formal approval of amendments to its 2012 privacy order to incorporate updated provisions included in the 2019 settlement.
On April 17, the Massachusetts attorney general announced a settlement with a credit reporting agency (CRA) to resolve a state investigation into a 2017 data breach that reportedly compromised the personal information of nearly three million Massachusetts residents. According to the AG’s 2017 complaint (covered by InfoBytes here), the CRA ignored cybersecurity vulnerabilities for months before the breach occurred and failed to take measures to implement and maintain reasonable safeguards. Under the terms of the proposed settlement, pending final court approval, the CRA will pay Massachusetts $18.2 million and is required to take significant measures to strengthen its security practices to ensure compliance with Massachusetts law. These measures include (i) implementing a comprehensive information security program; (ii) minimizing the collection of sensitive personal information; (iii) managing and implementing specific technical safeguards and controls; (iv) providing consumer-related relief, such as credit monitoring services and security freezes; and (v) allowing third-party assessments of its data safeguards.
Earlier, on April 14, the Indiana attorney general also announced that the CRA will pay the state $19.5 million to resolve allegations that it failed to protect Indiana residents whose personal information was exposed in the 2017 data breach. Under the terms of the final judgment and consent decree, in addition to paying $19.5 million in restitution, the CRA must take measures similar to those outlined in the Massachusetts settlement.
Massachusetts and Indiana were the only two states that chose not to participate in the earlier multi-agency settlement that resolved federal and state investigations into the 2017 data breach and required the company to pay up to $700 million (covered by InfoBytes here).
Separately, on April 7, the City of Chicago announced a $1.5 million settlement to resolve allegations that the CRA’s failure to employ adequate data-security measures led to the breach.
On March 31, the FCC adopted new rules that will require phone companies in the U.S. to deploy the STIR/SHAKEN caller ID authentication framework by June 30, 2021. As previously covered by InfoBytes, the STIR/SHAKEN framework addresses “unlawful spoofing by confirming that a call actually comes from the number indicated in the Caller ID, or at least that the call entered the US network through a particular voice service provider or gateway.” FCC Chairman Ajit Pai endorsed the value of widespread implementation, stating the framework will “reduce the effectiveness of illegal spoofing, allow law enforcement to identify bad actors more easily, and help phone companies identify—and even block—calls with illegal spoofed caller ID information before those calls reach their subscribers.” The new rules also contain a further notice of proposed rulemaking, which seeks comments on additional efforts to promote caller ID authentication and implement certain sections of the TRACED Act. Among other things, the TRACED Act—signed into law last December (covered by InfoBytes here)—mandated compliance with STIR/SHAKEN for all voice service providers.
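The attestation mechanism described above can be sketched in miniature. The Python below is a deliberately simplified stand-in, not the actual protocol: real STIR/SHAKEN carries ES256-signed PASSporT tokens (RFC 8225) between SIP providers using certificates, whereas this sketch substitutes a shared-key HMAC; the names `sign_call`, `verify_call`, and `PROVIDER_KEY` are illustrative only.

```python
import base64
import hashlib
import hmac
import json

# Shared key stands in for the originating provider's signing certificate.
PROVIDER_KEY = b"originating-provider-secret"

def sign_call(orig: str, dest: str, key: bytes = PROVIDER_KEY) -> dict:
    """Originating provider attests that the call really comes from `orig`."""
    claims = {"orig": orig, "dest": dest, "attest": "A"}  # "A" = full attestation
    payload = base64.urlsafe_b64encode(json.dumps(claims, sort_keys=True).encode())
    sig = hmac.new(key, payload, hashlib.sha256).hexdigest()
    return {"payload": payload.decode(), "sig": sig}

def verify_call(token: dict, key: bytes = PROVIDER_KEY) -> bool:
    """Terminating provider checks the signature before trusting the Caller ID."""
    expected = hmac.new(key, token["payload"].encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, token["sig"])

token = sign_call("+13125550100", "+12025550199")
assert verify_call(token)  # legitimate call verifies end to end

# A spoofer swaps in a different originating number but cannot re-sign:
spoofed_claims = {"orig": "+19995550000", "dest": "+12025550199", "attest": "A"}
spoofed_payload = base64.urlsafe_b64encode(
    json.dumps(spoofed_claims, sort_keys=True).encode()).decode()
spoofed = dict(token, payload=spoofed_payload)
assert not verify_call(spoofed)  # altered Caller ID fails verification
```

The key idea the rules rely on is the same as in this sketch: a spoofed number can still be placed in the Caller ID field, but it can no longer carry a valid attestation from the provider where the call entered the network.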
On February 25, the FTC released its annual report highlighting the agency’s privacy and data security work in 2019. Among other items, the report summarizes consumer-related enforcement activities in 2019, including:
- A $5 billion penalty—the largest consumer privacy penalty to date—against a global social media company to resolve allegations that the company violated its 2012 FTC privacy order and mishandled users’ personal information. (Covered by InfoBytes here.)
- A $170 million penalty against a global online search engine and its video-sharing subsidiary to resolve alleged violations of the Children’s Online Privacy Protection Act (COPPA). (Covered by InfoBytes here.)
- A proposed settlement in the FTC’s first case against developers of “stalking” apps that monitor consumers’ mobile devices and allegedly compromise consumer privacy in violation of the FTC Act’s prohibition against unfair and deceptive practices and of COPPA.
- A global settlement of up to $700 million issued in conjunction with the CFPB, 48 states, the District of Columbia and Puerto Rico, to resolve federal and state investigations into a 2017 data breach that reportedly compromised sensitive information for approximately 147 million consumers. (Covered by InfoBytes here.)
The report also discusses the FTC’s enforcement of the EU-U.S. Privacy Shield framework, provides links to FTC congressional testimony on privacy and data security, and offers a list of relevant rulemaking, including rules currently under review. In addition, the report highlights recent privacy-related events, including (i) an FTC hearing examining consumer privacy as part of its Hearings on Competition and Consumer Protection in the 21st Century; (ii) the fourth annual PrivacyCon event, which hosted research presentations on consumer privacy and security issues (covered by InfoBytes here); (iii) a workshop examining possible updates to COPPA; and (iv) a public workshop that examined issues affecting consumer reporting accuracy.
On January 28, the CFTC announced that it has adopted the National Institute of Standards and Technology (NIST) Privacy Framework, making it the first federal agency to do so. NIST’s preliminary draft of the framework, released in September, described it as “[a] Tool for Improving Privacy through Enterprise Risk Management” (covered by InfoBytes here). Among other things, the privacy framework, which advances guidance to mitigate cybersecurity risk, describes processes to mitigate risks associated with data processing and privacy breaches and to assess current privacy risk management measures. According to the announcement, the CFTC will utilize the framework to “better manage and communicate privacy risk throughout the agency,” positioning it as a leader in the data privacy protection arena.
In January, the Federal Reserve Bank of New York (New York Fed) released a staff report that analyzes how a cyber attack transmitted through a payment network could be amplified throughout the U.S. financial system. According to the report, Cyber Risk and the U.S. Financial System: A Pre-Mortem Analysis, cyber attacks that impair the most active U.S. banks’ ability to send payments “would likely be amplified to affect the liquidity of many other banks in the system,” including smaller or mid-sized banks that are connected through a shared service provider. The New York Fed notes, however, that the report’s primary focus is on a cyber attack’s impact within a single day, and cautions that should a cyber attack compromise the integrity of the banking system, “the reconciliation and repercussion process would be an unprecedented task.” Among other things, the report (i) establishes a framework for estimating “cyber vulnerability” and understanding the impairments of a cyber attack on a bank’s payment activities; (ii) creates a baseline scenario to study the five largest institutions within the wholesale payment network and the high concentration of payments between large institutions, as well as the resulting imbalance in liquidity that occurs if even a single large institution is unable to remit payments to its counterparties; and (iii) conducts a reverse stress test exercise, in which it analyzes “how many smaller institutions it would take to impair any of the most active ones,” in order to highlight “how the impairment of many smaller institutions also presents a systemic risk.”
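The amplification dynamic the report describes can be illustrated with a deliberately tiny sketch. This is a simplification for intuition only, not the New York Fed’s actual model: when one highly connected bank is impaired and withholds its outgoing payments, its counterparties end the day with far less liquidity than in the normal scenario. The bank names, balances, and payment amounts below are invented.

```python
def settle_day(banks, payments, impaired=frozenset()):
    """Run one day of gross settlement; impaired banks receive but never send."""
    balances = dict(banks)
    for sender, receiver, amount in payments:
        if sender in impaired:
            continue  # impaired bank hoards liquidity instead of remitting
        balances[sender] -= amount
        balances[receiver] += amount
    return balances

# One large institution and two mid-sized counterparties (illustrative figures).
banks = {"BigBank": 100, "Mid1": 20, "Mid2": 20}
payments = [
    ("BigBank", "Mid1", 30), ("BigBank", "Mid2", 30),  # large bank's outflows
    ("Mid1", "Mid2", 10), ("Mid2", "BigBank", 10),
]

normal = settle_day(banks, payments)
stressed = settle_day(banks, payments, impaired={"BigBank"})

# The mid-sized banks' combined end-of-day liquidity collapses when the
# large institution is impaired, even though only one bank was attacked:
print(normal)    # {'BigBank': 50, 'Mid1': 40, 'Mid2': 50}
print(stressed)  # {'BigBank': 110, 'Mid1': 10, 'Mid2': 20}
```

Even in this three-bank toy, impairing the single most active sender cuts its counterparties’ combined liquidity from 90 to 30, which is the shape of the amplification effect the report quantifies at system scale.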
On January 13, the U.S. District Court for the Northern District of Georgia issued a final order and judgment in a class action settlement between a class of consumers (plaintiffs) and a large credit reporting agency (company) to resolve allegations arising from a 2017 cyberattack and the resulting data breach of the company. After the company announced the breach, many consumers filed suit and were eventually consolidated into a proposed settlement class. As previously covered by InfoBytes, the plaintiffs alleged that the company (i) failed to provide appropriate security to protect stored personal consumer information; (ii) misled consumers regarding the effectiveness and capacity of its security; and (iii) failed to take proper action when vulnerabilities in its security system became known. The company and the plaintiffs later submitted a proposed settlement order to the court.
According to the final order and judgment, the court certified the settlement class of the approximately 147 million affected consumers, finding the class was adequately represented, and approved the “distribution and allocation plan” as fair and reasonable. Under the approved settlement, the company agreed to, among other things, pay $380.5 million into a settlement fund and potentially up to $125 million more to cover “certain out-of-pocket losses,” $77.5 million for attorneys’ fees, and approximately $1.4 million for reimbursement of expenses. Class members are eligible for additional benefits, including up to 10 years of credit monitoring and identity theft protection services (or cash compensation if they already have those services), as well as identity restoration services for seven years. The company also agreed to spend at least $1 billion on data security and technology over the next five years.
- Jonice Gray Tucker to discuss "Fair servicing in wake of Covid-19" at an American Bar Association webinar
- APPROVED Webcast: Maximizing vendor value
- Daniel P. Stipano to discuss "Cram for the exam: Best prep strategies for a regulatory examination" at an ACAMS webinar
- Melissa Klimkiewicz to discuss "Flood insurance basics" at the NAFCU Virtual Regulatory Compliance School
- Sasha Leonhardt to discuss "Privacy laws clarified" at the National Settlement Services Summit (NS3)
- Amanda R. Lawrence to discuss "New privacy legislation: Preparing for a major source of class action and enforcement activity going forward" at the American Conference Institute Consumer Finance Class Actions, Litigation & Government Enforcement Actions