IOSCO releases draft policy recommendations for crypto and digital asset markets
Earlier this month, the International Organization of Securities Commissions (IOSCO) released draft policy recommendations to support greater regulatory and oversight consistency within the crypto and digital assets markets. According to the global securities watchdog, regulators must strive for consistency in their oversight of crypto-asset activities given the cross-border nature of these markets and the varying approaches taken by individual jurisdictions. Seeking to optimize consistency in the way crypto-asset and securities markets are regulated, the IOSCO advised regulators to enhance cooperation efforts and attempt “to achieve regulatory outcomes for investor protection and market integrity that are the same as, or consistent with, those required in traditional financial markets in order to facilitate a level-playing field between crypto-assets and traditional financial markets and help reduce the risk of regulatory arbitrage.” Encouraging regulators to engage in rulemaking and information sharing, the IOSCO presented a comprehensive strategy for harmonizing the oversight of crypto companies, including standards on conflicts of interest and governance, fraud and market abuse, cross-border cooperation, custody of client monies and assets, and operational and technological risks. The IOSCO also suggested measures for reducing money laundering risks, explaining that crypto assets may be more appealing to criminals who want to avoid traditional financial system oversight. The IOSCO noted that its goal is to finalize its policy recommendations in early Q4 2023. Comments will be received through July 31.
Montana becomes the ninth state to enact comprehensive privacy legislation
On May 19, the Montana governor signed SB 384 to enact the Consumer Data Privacy Act (CDPA) and establish a framework for controlling and processing consumer personal data in the state. Montana is now the ninth state in the nation to enact comprehensive consumer privacy measures, following California, Colorado, Connecticut, Virginia, Utah, Iowa, Indiana, and Tennessee. The CDPA applies to any person that conducts business in the state or produces products or services targeted to state residents and, during a calendar year, (i) controls or processes personal data of at least 50,000 consumers (“excluding personal data controlled or processed solely for the purpose of completing a payment transaction”), or (ii) controls or processes personal data of at least 25,000 consumers and derives 25 percent of gross revenue from the sale of personal data. The CDPA provides several exemptions, including nonprofit organizations, registered securities associations, financial institutions, data governed by the Gramm-Leach-Bliley Act and certain other federal laws, and covered entities governed by the Health Insurance Portability and Accountability Act. Highlights of the CDPA include:
- Consumers’ rights. Under the CDPA, consumers will be able to access their personal data; correct inaccuracies; request deletion of their data; obtain a copy of their data in a portable format; and opt out of the sale of their data. A consumer may also designate an authorized agent to act on the consumer’s behalf to opt out of the processing of their personal data.
- Data controllers’ responsibilities. Data controllers under the CDPA will be responsible for, among other things, (i) responding to consumer requests within 45 days unless extenuating circumstances arise and providing requested information free of charge, once per consumer during a 12-month period; (ii) establishing a process to allow consumer appeals within a reasonable time period after a controller’s refusal to take action on a consumer’s request; (iii) establishing clear and conspicuous opt-out methods on a website that require consumers to affirmatively and freely choose to opt out of any processing of their personal data (and allowing for a mechanism that lets consumers revoke consent that is at least as easy as the mechanism used to provide consent); (iv) limiting the collection of data to what is adequate, relevant, and reasonably necessary for a specified purpose; (v) securing personal data from unauthorized access; (vi) processing data in compliance with state and federal anti-discrimination laws; (vii) obtaining consumer consent in order to process sensitive data; (viii) providing clear and meaningful privacy notices; and (ix) conducting data protection assessments and ensuring deidentified data cannot be associated with a consumer. The CDPA also sets forth obligations relating to contracts between a controller and a processor, including ensuring that such contracts do not waive or limit consumer data rights.
- No private right of action but enforcement by state attorney general. The CDPA explicitly prohibits a private right of action. Instead, it grants the state attorney general exclusive authority to enforce the law.
- Right to cure. Upon discovering a potential violation of the CDPA, the attorney general must give the data controller notice. The data controller then has 60 days to cure the alleged violation before the attorney general can file suit. The cure provision expires April 1, 2026.
The CDPA takes effect October 1, 2024.
FTC, DOJ sue maker of health app over data sharing
On May 17, the DOJ filed a complaint on behalf of the FTC against a health app for violating the Health Breach Notification Rule (HBNR) by allegedly sharing users’ sensitive personal information with third parties, disclosing sensitive health data, and failing to notify users of these unauthorized disclosures. According to the complaint, users were allegedly repeatedly and falsely promised via privacy policies that their health information would not be shared with third parties without the user’s knowledge or consent, and that any collected data was non-identifiable and only used for the defendant’s own analytics or advertising. The FTC charged the defendant with failing to implement reasonable measures to address the privacy and data security risks created by its use of third-party automated tracking tools and for sharing health information used for advertising purposes without obtaining users’ affirmative express consent. Under the HBNR, companies with access to personal health records are required to notify users, the FTC, and media outlets in certain situations, if there has been an unauthorized acquisition of unsecured personal health information. The defendant also allegedly failed to impose limits on how third parties could use the data and failed to adequately encrypt data shared with third parties, thus subjecting the data to potential interception and/or seizure by bad actors.
The proposed court order would require the defendant to pay a $100,000 civil penalty, and would permanently prohibit the company from sharing personal health data with third parties for advertising and from making future misrepresentations about its privacy practices. The defendant would also be required to (i) obtain user consent before sharing personal health data; (ii) limit data retention; (iii) request deletion of data shared with third parties; (iv) provide notices to users explaining the FTC’s allegations and the proposed settlement; and (v) implement comprehensive security and privacy programs to protect consumer data. The defendant has also agreed to pay a total of $100,000 to Connecticut, the District of Columbia, and Oregon (which collaborated with the FTC on the action) for violating state privacy laws with respect to its data sharing and privacy practices.
FTC proposes changes to Health Breach Notification Rule
On May 18, the FTC issued a notice of proposed rulemaking (NPRM) and request for public comment on changes to its Health Breach Notification Rule (Rule), following a notice issued last September (covered by InfoBytes here) warning health apps and connected devices collecting or using consumers’ health information that they must comply with the Rule and notify consumers and others if a consumer’s health data is breached. The Rule also ensures that entities not covered by HIPAA are held accountable in the event of a security breach. The NPRM proposed several changes to the Rule, including modifying the definition of “[personal health records (PHR)] identifiable health information,” clarifying that a “breach of security” would include the unauthorized acquisition of identifiable health information, and specifying that “only entities that access or send unsecured PHR identifiable health information to a personal health record—rather than entities that access or send any information to a personal health record—qualify as PHR related entities.” The modifications would also authorize the expanded use of email and other electronic methods for providing notice of a breach to consumers and would expand the required content for notices “to include information about the potential harm stemming from the breach and the names of any third parties who might have acquired any unsecured personally identifiable health information.” Comments on the NPRM are due 60 days after publication in the Federal Register.
The same day, the FTC also issued a policy statement warning businesses against making misleading claims about the accuracy or efficacy of biometric technologies like facial recognition. The FTC emphasized that the increased use of consumers’ biometric information and biometric information technologies (including those powered by machine learning) raises significant consumer privacy and data security concerns and increases the potential for bias and discrimination. The FTC stressed that it intends to combat unfair or deceptive acts and practices related to these issues and outlined several factors used to determine potential violations of the FTC Act.
Tennessee becomes eighth state to enact comprehensive privacy legislation
On May 11, the Tennessee governor signed HB 1181 to enact the Tennessee Information Protection Act (TIPA) and establish a framework for controlling and processing consumers’ personal data in the state. Tennessee is now the eighth state in the nation to enact comprehensive consumer privacy measures, following California, Colorado, Connecticut, Virginia, Utah, Iowa, and Indiana. TIPA applies to any person that conducts business in the state or produces products or services targeted to residents and, during a calendar year, (i) controls or processes personal data of at least 100,000 Tennessee residents or (ii) controls or processes personal data of at least 25,000 Tennessee residents and derives 50 percent of gross revenue from the sale of personal data. TIPA provides for several exemptions, including financial institutions and data governed by the Gramm-Leach-Bliley Act and certain other federal laws, as well as covered entities governed by the Health Insurance Portability and Accountability Act. Highlights of TIPA include:
- Consumers’ rights. Under TIPA, consumers will be able to access their personal data; make corrections; request deletion of their data; obtain a copy of their data in a portable format; request what categories of information were sold or disclosed; and opt out of the sale of their data.
- Controllers’ responsibilities. Data controllers under TIPA will be responsible for, among other things, (i) responding to consumers’ requests within 45 days unless extenuating circumstances arise and providing requested information free of charge, up to twice annually for each consumer; (ii) establishing an appeals process to allow consumer appeals within a reasonable time period after a controller’s refusal to take action on a consumer’s request; (iii) limiting the collection of data to what is required and reasonably necessary for a specified purpose; (iv) not processing data for reasons incompatible with the specified purpose; (v) securing personal data from unauthorized access; (vi) not processing data in violation of state or federal anti-discrimination laws; (vii) obtaining consumer consent in order to process sensitive data; (viii) ensuring contracts and agreements do not waive or limit consumers’ data rights; and (ix) providing clear and meaningful privacy notices. TIPA also sets forth obligations relating to contracts between a controller and a processor.
- No private right of action but enforcement by state attorney general. TIPA explicitly prohibits a private right of action. Instead, it grants the state attorney general exclusive authority to enforce the law and seek penalties of up to $15,000 per violation and treble damages for willful or knowing violations. The attorney general may also recover reasonable expenses, including attorney fees, for any initiated action.
- Right to cure. Upon discovering a potential violation of TIPA, the attorney general must give the data controller written notice. The data controller then has 60 days to cure the alleged violation before the attorney general can file suit.
- Affirmative defense. TIPA establishes an affirmative defense for violations for controllers and processors that adopt a privacy program “that reasonably conforms” to the National Institute of Standards and Technology Privacy Framework and complies with required provisions. Failing “to maintain a privacy program that reflects the controller or processor's data privacy practices to a reasonable degree of accuracy” will be considered an unfair and deceptive act or practice under Tennessee law.
TIPA takes effect July 1, 2024.
France fines facial recognition company additional €5.2 million for noncompliance
On May 10, the French data protection agency, Commission Nationale de l’Informatique et des Libertés (CNIL), imposed an overdue penalty payment of €5.2 million on a facial recognition company for failing to comply with an October order. As previously covered by InfoBytes, last fall CNIL imposed a €20 million penalty against the company for allegedly violating the EU’s General Data Protection Regulation (GDPR) after investigations found that the company allegedly processed personal biometric data without a legal basis (a breach of article 6 of the GDPR), and failed to take into account an individual’s rights in an “effective and satisfactory way”—particularly with respect to requests for access to their data (a breach of articles 12, 15 and 17 of the GDPR). CNIL reported that the company had two months after receiving the October order to stop collecting and processing data on individuals located in France “without any legal basis, and to delete the data of these individuals, after responding to requests for access it received.” Because the company did not submit proof of compliance within this time frame, CNIL imposed an additional fine on top of the original penalty.
CFPB general counsel highlights risks in payments industry
On May 9, CFPB General Counsel and Senior Advisor to the Director, Seth Frotman, discussed the evolution of the payments system and its significant impact on consumer financial protection. Speaking before the Innovative Payments Association, Frotman commented that growth in the use of noncash payments (i.e., ACH, cards, and checks) accelerated faster from 2018 to 2021 than in any previous period, with the value of noncash payments increasing nearly 10 percent per year since 2018 to approach $130 trillion in 2021. The value of ACH transfers and the number of card payments also increased tremendously, Frotman noted, pointing to a rapid decline in ATM cash withdrawals and the use of checks. He observed that the use of peer-to-peer (P2P) payment platforms and digital wallets is also growing quickly, with more traditional financial institutions redoubling their efforts to expand product offerings to capture market share in this space. Additionally, several large tech firms, drawing on their significant customer bases and brand recognition, are looking to integrate payment services into their operating systems, with some offering payment products used by consumers daily, Frotman said.
Addressing concerns relating to data harvesting and privacy, Frotman said the Bureau is concerned that companies, including big tech companies, are using payment data to engage in behavioral targeting or individualized marketing, while some companies are sharing detailed payments information with data brokers or third parties as a way to monetize data. These behaviors, which he said will only increase as payment systems continue to grow, raise the potential for harm, including limiting competition and consumer choice and stifling innovation. Frotman added that these issues are not limited to big tech. Banks, Frotman said, are also rolling out digital wallets as a way to access payment information, and Buy Now, Pay Later lenders are collecting consumer data “to increase the likelihood of incremental sales and maximize the lifetime value extracted from each current, past, or potential borrower.” Frotman reminded attendees that the Bureau has several critical tools at its disposal to address concerns about how data is bought, sold, used, and protected, and warned the payments industry to comply with applicable legal requirements.
Frotman also discussed challenges facing “gig” and other non-standard workers when trying to navigate consumer financial markets, particularly with respect to the intersection between how workers are being paid and the Electronic Fund Transfer Act (EFTA). According to Frotman, the Bureau is concerned about whether gig workers are being improperly required to receive payments through a particular financial institution or via a particular payment product or app. Frotman instructed employers to provide payment options that do not require workers to establish an account with a particular institution to ensure they do not run afoul of the EFTA’s “compulsory use” provision. Consumers who use a personal P2P app for work transactions are also entitled to EFTA protections with respect to fraud and error resolution, Frotman added. Frotman closed his remarks by touching briefly on liquidity and stability in the P2P payment system. He warned that consumers who use P2P payment products to store funds do not have the same level of protection as consumers who use traditional banking products.
Crypto platform reaches $1.2 million settlement over alleged compliance failures
On May 1, NYDFS issued a consent order against a cryptocurrency trading platform for engaging in alleged violations of the state’s cybersecurity regulation (23 NYCRR Part 500). According to the consent order, during examinations conducted in 2018 and 2020, NYDFS identified multiple alleged deficiencies in the respondent’s cybersecurity program, as required by both the cybersecurity regulation and the state’s virtual currency regulation (23 NYCRR Part 200). Following the examinations, NYDFS initiated an investigation into the respondent’s cybersecurity program. The Department concluded that the respondent failed to conduct periodic cybersecurity risk assessments “sufficient to inform the design of the cybersecurity program,” and failed to establish and maintain an effective cybersecurity program and implement a reviewed and board-approved written cybersecurity policy. Moreover, NYDFS claimed the respondent’s policies and procedures were not customized to meet the company’s needs and risks. Under the terms of the consent order, the respondent must pay a $1.2 million civil monetary penalty and submit quarterly progress reports to NYDFS detailing its remediation efforts.
District Court dismisses FTC’s privacy claims in geolocation action
On May 4, the U.S. District Court for the District of Ohio issued two separate rulings in a pair of related disputes between the FTC and a data broker. The disputes center around accusations made by the FTC last August that the data broker violated Section 5 of the FTC Act by unfairly selling precise geolocation data from hundreds of millions of mobile devices which can be used to trace individuals’ movements to and from sensitive locations (covered by InfoBytes here). The FTC sought a permanent injunction to stop the data broker’s practices, as well as additional relief. The data broker, upon learning that the FTC planned to file a lawsuit against it, filed a preemptive lawsuit challenging the agency’s authority.
The court first dismissed the data broker’s preemptive bid to block the FTC’s enforcement action, ruling that the data broker has not identified any “viable cause of action” to support its request for injunctive relief. The court explained that injunctive relief is a “drastic remedy” that is only available if no other legal remedy is available. However, the data broker possesses an “adequate remedy at law,” the court said, “because it can seek dismissal of, and otherwise directly defend against, the FTC’s enforcement action.”
With respect to the FTC’s action, the court granted the data broker’s motion to dismiss the FTC’s complaint, but gave the agency leave to amend. The court agreed with the data broker that the FTC’s complaint lacks sufficient allegations to support its unfairness claim under Section 5 of the FTC Act. While the court disagreed with the data broker’s assertion that it did not have “fair notice that its sale of geolocation data without restrictions near sensitive locations could violate Section 5(a) of the FTC Act” or that the FTC had to allege a predicate violation of law or policy to state a claim, the court determined that the FTC failed to adequately allege that the data broker’s practices created “a ‘significant risk’ of concrete harm.” Moreover, the court found that “the purported privacy intrusion is not severe enough to constitute ‘substantial injury’ under Section 5(n).” The court noted, however, that some of the deficiencies may be cured through additional factual allegations in an amended complaint.
EU court says non-material damages in unlawful data processing may be eligible for compensation
On May 4, the Court of Justice of the European Union (CJEU) issued a judgment concluding that while not every infringement of the EU’s data protection law gives rise, by itself, to a right to compensation, non-material damage resulting from unlawful processing of data can be eligible for compensation. The CJEU reviewed questions posed by the Austrian Supreme Court on whether a mere infringement of the GDPR is sufficient to confer the right to compensation for individuals suffering non-material damages, and whether such compensation is possible only if the non-material damage suffered reaches a certain degree of seriousness. The Austrian Supreme Court also asked the CJEU to clarify what the EU-law requirements are when determining the amount of damages.
The CJEU clarified that the General Data Protection Regulation (GDPR) does not set thresholds for the “seriousness” of damages needed to confer a right to compensation. “[I]t is clear that the right to compensation provided for by the GDPR is subject to three cumulative conditions: infringement of the GDPR, material or non-material damage resulting from that infringement and a causal link between the damage and the infringement,” the court said in the announcement. Limiting the right to compensation to non-material damage that reaches a certain threshold would be contrary to the broad conception of “damage” outlined in EU law, the CJEU explained, pointing out that conditioning compensation on such a threshold would produce different outcomes depending on a court’s assessment. Moreover, the CJEU emphasized that because the GDPR does not contain any rules governing the assessment of damages, it is up to each member state’s legal system to prescribe detailed rules for actions intended to safeguard individuals’ rights under the GDPR, as well as the criteria for determining the amount of compensation, provided the determination complies with the principles of equivalence and effectiveness. The CJEU explained in its ruling that “an infringement of the GDPR does not necessarily result in damage, and that there must be a causal link between the infringement in question and the damage suffered by the data subject in order to establish a right to compensation.”