
InfoBytes Blog

Financial Services Law Insights and Observations


  • FTC, DOJ sue maker of health app over data sharing

    Federal Issues

    On May 17, the DOJ filed a complaint on behalf of the FTC against a health app for violating the Health Breach Notification Rule (HBNR) by allegedly sharing users’ sensitive personal information with third parties, disclosing sensitive health data, and failing to notify users of these unauthorized disclosures. According to the complaint, the company repeatedly and falsely promised users in its privacy policies that their health information would not be shared with third parties without the user’s knowledge or consent, and that any collected data was non-identifiable and used only for the defendant’s own analytics or advertising. The FTC charged the defendant with failing to implement reasonable measures to address the privacy and data security risks created by its use of third-party automated tracking tools and with sharing health information used for advertising purposes without obtaining users’ affirmative express consent. Under the HBNR, companies with access to personal health records are required to notify users, the FTC, and, in certain situations, media outlets if there has been an unauthorized acquisition of unsecured personal health information. The defendant also allegedly failed to impose limits on how third parties could use the data and failed to adequately encrypt data shared with third parties, thus subjecting the data to potential interception and/or seizure by bad actors.

    The proposed court order would require the defendant to pay a $100,000 civil penalty and would permanently prohibit the company from sharing personal health data with third parties for advertising and from making future misrepresentations about its privacy practices. The defendant would also be required to (i) obtain user consent before sharing personal health data; (ii) limit data retention; (iii) request deletion of data shared with third parties; (iv) provide notices to users explaining the FTC’s allegations and the proposed settlement; and (v) implement comprehensive security and privacy programs to protect consumer data. The defendant has also agreed to pay a total of $100,000 to Connecticut, the District of Columbia, and Oregon (which collaborated with the FTC on the action) for violating state privacy laws with respect to its data sharing and privacy practices.

    Federal Issues Privacy, Cyber Risk & Data Security FTC DOJ Consumer Protection Health Breach Notification Rule Enforcement Connecticut District of Columbia Oregon

  • FTC proposes changes to Health Breach Notification Rule

    Agency Rule-Making & Guidance

    On May 18, the FTC issued a notice of proposed rulemaking (NPRM) and request for public comment on changes to its Health Breach Notification Rule (Rule), following a notice issued last September (covered by InfoBytes here) warning health apps and connected devices collecting or using consumers’ health information that they must comply with the Rule and notify consumers and others if a consumer’s health data is breached. The Rule also ensures that entities not covered by HIPAA are held accountable in the event of a security breach. The NPRM proposed several changes to the Rule, including modifying the definition of “[personal health records (PHR)] identifiable health information,” clarifying that a “breach of security” would include the unauthorized acquisition of identifiable health information, and specifying that “only entities that access or send unsecured PHR identifiable health information to a personal health record—rather than entities that access or send any information to a personal health record—qualify as PHR related entities.” The modifications would also authorize the expanded use of email and other electronic methods for providing notice of a breach to consumers and would expand the required content for notices “to include information about the potential harm stemming from the breach and the names of any third parties who might have acquired any unsecured personally identifiable health information.” Comments on the NPRM are due 60 days after publication in the Federal Register.

    The same day, the FTC also issued a policy statement warning businesses against making misleading claims about the accuracy or efficacy of biometric technologies like facial recognition. The FTC emphasized that the increased use of consumers’ biometric information and biometric information technologies (including those powered by machine learning) raises significant consumer privacy and data security concerns and increases the potential for bias and discrimination. The FTC stressed that it intends to combat unfair or deceptive acts and practices related to these issues and outlined several factors used to determine potential violations of the FTC Act.

    Agency Rule-Making & Guidance Federal Issues Privacy, Cyber Risk & Data Security FTC Consumer Protection Biometric Data Artificial Intelligence Unfair Deceptive UDAP FTC Act

  • Tennessee becomes 8th state to enact comprehensive privacy legislation

    Privacy, Cyber Risk & Data Security

    On May 11, the Tennessee governor signed HB 1181 to enact the Tennessee Information Protection Act (TIPA) and establish a framework for controlling and processing consumers’ personal data in the state. Tennessee is now the eighth state in the nation to enact comprehensive consumer privacy measures, following California, Colorado, Connecticut, Virginia, Utah, Iowa, and Indiana. TIPA applies to any person that conducts business in the state or produces products or services targeted to residents and, during a calendar year, (i) controls or processes personal data of at least 100,000 Tennessee residents or (ii) controls or processes personal data of at least 25,000 Tennessee residents and derives 50 percent of gross revenue from the sale of personal data. TIPA provides for several exemptions, including financial institutions and data governed by the Gramm-Leach-Bliley Act and certain other federal laws, as well as covered entities governed by the Health Insurance Portability and Accountability Act. Highlights of TIPA include:

    • Consumers’ rights. Under TIPA, consumers will be able to access their personal data; make corrections; request deletion of their data; obtain a copy of their data in a portable format; request what categories of information were sold or disclosed; and opt out of the sale of their data.
    • Controllers’ responsibilities. Data controllers under TIPA will be responsible for, among other things, (i) responding to consumers’ requests within 45 days unless extenuating circumstances arise and providing requested information free of charge, up to twice annually for each consumer; (ii) establishing an appeals process to allow consumer appeals within a reasonable time period after a controller’s refusal to take action on a consumer’s request; (iii) limiting the collection of data to what is required and reasonably necessary for a specified purpose; (iv) not processing data for reasons incompatible with the specified purpose; (v) securing personal data from unauthorized access; (vi) not processing data in violation of state or federal anti-discrimination laws; (vii) obtaining consumer consent in order to process sensitive data; (viii) ensuring contracts and agreements do not waive or limit consumers’ data rights; and (ix) providing clear and meaningful privacy notices. TIPA also sets forth obligations relating to contracts between a controller and a processor.
    • No private right of action but enforcement by state attorney general. TIPA explicitly prohibits a private right of action. Instead, it grants the state attorney general exclusive authority to enforce the law and seek penalties of up to $15,000 per violation and treble damages for willful or knowing violations. The attorney general may also recover reasonable expenses, including attorney fees, for any initiated action.
    • Right to cure. Upon discovering a potential violation of TIPA, the attorney general must give the data controller written notice. The data controller then has 60 days to cure the alleged violation before the attorney general can file suit.
    • Affirmative defense. TIPA establishes an affirmative defense for violations for controllers and processors that adopt a privacy program “that reasonably conforms” to the National Institute of Standards and Technology Privacy Framework and complies with required provisions. Failing “to maintain a privacy program that reflects the controller or processor's data privacy practices to a reasonable degree of accuracy” will be considered an unfair and deceptive act or practice under Tennessee law.

    TIPA takes effect July 1, 2024.

    Privacy, Cyber Risk & Data Security State Issues State Legislation Tennessee Consumer Protection

  • France fines facial recognition company additional €5.2 million for noncompliance

    Privacy, Cyber Risk & Data Security

    On May 10, the French data protection agency, Commission Nationale de l’Informatique et des Libertés (CNIL), imposed an overdue penalty payment of €5.2 million on a facial recognition company for failing to comply with an October order. As previously covered by InfoBytes, last fall CNIL imposed a €20 million penalty against the company for allegedly violating the EU’s General Data Protection Regulation (GDPR) after investigations found that the company allegedly processed personal biometric data without a legal basis (a breach of article 6 of the GDPR) and failed to take into account individuals’ rights in an “effective and satisfactory way”—particularly with respect to requests for access to their data (a breach of articles 12, 15, and 17 of the GDPR). CNIL reported that the company had two months after receiving the October order to stop collecting and processing data on individuals located in France “without any legal basis, and to delete the data of these individuals, after responding to requests for access it received.” Because the company did not submit proof of compliance within this time frame, CNIL imposed an additional fine on top of the original penalty.

    Privacy, Cyber Risk & Data Security Courts Of Interest to Non-US Persons EU France GDPR Enforcement

  • CFPB general counsel highlights risks in payments industry

    Federal Issues

    On May 9, CFPB General Counsel and Senior Advisor to the Director, Seth Frotman, discussed the evolution of the payments system and its significant impact on consumer financial protection. Speaking before the Innovative Payments Association, Frotman commented that the use of noncash payments (i.e., ACH, cards, and checks) grew faster from 2018 to 2021 than in any previous period, with the value of noncash payments since 2018 increasing nearly 10 percent per year and approaching almost $130 trillion in 2021. The value of ACH transfers and the number of card payments also increased tremendously, Frotman noted, pointing to a rapid decline in ATM cash withdrawals and the use of checks. He observed that the use of peer-to-peer (P2P) payment platforms and digital wallets is also growing quickly, with more traditional financial institutions redoubling their efforts to expand product offerings to capture market share in this space. Additionally, several large tech firms, drawing on their significant customer bases and brand recognition, are looking to integrate payment services into their operating systems, with some offering payment products used by consumers daily, Frotman said.

    Addressing concerns relating to data harvesting and privacy, Frotman said the Bureau is concerned that companies, including big tech companies, are using payment data to engage in behavioral targeting or individualized marketing, while some companies are sharing detailed payments information with data brokers or third parties as a way to monetize data. These behaviors, which he said will only increase as payment systems continue to grow, raise the potential for harm, including limiting competition and consumer choice and stifling innovation. Frotman added that these issues are not limited to big tech. Banks, Frotman said, are also rolling out digital wallets as a way to access payment information, and Buy Now, Pay Later lenders are collecting consumer data “to increase the likelihood of incremental sales and maximize the lifetime value extracted from each current, past, or potential borrower.” Frotman reminded attendees that the Bureau has several critical tools at its disposal to address concerns about how data is bought, sold, used, and protected, and warned the payments industry to comply with applicable legal requirements.

    Frotman also discussed challenges facing “gig” and other non-standard workers when trying to navigate consumer financial markets, particularly with respect to the intersection between how workers are being paid and the EFTA. According to Frotman, the Bureau is concerned about whether gig workers are being improperly required to receive payments through a particular financial institution or via a particular payment product or app. Frotman instructed employers to provide payment options that do not require workers to establish an account with a particular institution to ensure they do not run afoul of the EFTA’s “compulsory use” provision. Consumers who use a personal P2P app for work transactions are also entitled to EFTA protections with respect to fraud and error resolution, Frotman added. Frotman closed his remarks by touching briefly on liquidity and stability in the P2P payment system. He warned that consumers who use P2P payment products to store funds do not have the same level of protection as consumers who use traditional banking products.

    Federal Issues CFPB Payments Privacy, Cyber Risk & Data Security Consumer Finance Peer-to-Peer Digital Wallets EFTA

  • Crypto platform reaches $1.2 million settlement on alleged compliance failures

    State Issues

    On May 1, NYDFS issued a consent order against a cryptocurrency trading platform for alleged violations of the state’s cybersecurity regulation (23 NYCRR Part 500). According to the consent order, during examinations conducted in 2018 and 2020, NYDFS identified multiple alleged deficiencies in the respondent’s cybersecurity program as measured against the requirements of both the cybersecurity regulation and the state’s virtual currency regulation (23 NYCRR Part 200). Following the examinations, NYDFS initiated an investigation into the respondent’s cybersecurity program. The Department concluded that the respondent failed to conduct periodic cybersecurity risk assessments “sufficient to inform the design of the cybersecurity program,” and failed to establish and maintain an effective cybersecurity program and implement a reviewed and board-approved written cybersecurity policy. Moreover, NYDFS claimed the respondent’s policies and procedures were not customized to meet the company’s needs and risks. Under the terms of the consent order, the respondent must pay a $1.2 million civil monetary penalty and submit quarterly progress reports to NYDFS detailing its remediation efforts.

    State Issues Digital Assets Privacy, Cyber Risk & Data Security State Regulators NYDFS New York Enforcement Cryptocurrency 23 NYCRR Part 200 23 NYCRR Part 500 Virtual Currency

  • District Court dismisses FTC’s privacy claims in geolocation action

    Federal Issues

    On May 4, the U.S. District Court for the District of Ohio issued two separate rulings in a pair of related disputes between the FTC and a data broker. The disputes center around accusations made by the FTC last August that the data broker violated Section 5 of the FTC Act by unfairly selling precise geolocation data from hundreds of millions of mobile devices, which can be used to trace individuals’ movements to and from sensitive locations (covered by InfoBytes here). The FTC sought a permanent injunction to stop the data broker’s practices, as well as additional relief. The data broker, upon learning that the FTC planned to file a lawsuit against it, filed a preemptive lawsuit challenging the agency’s authority.

    The court first dismissed the data broker’s preemptive bid to block the FTC’s enforcement action, ruling that the data broker has not identified any “viable cause of action” to support its request for injunctive relief. The court explained that injunctive relief is a “drastic remedy” that is only available if no other legal remedy is available. However, the data broker possesses an “adequate remedy at law,” the court said, “because it can seek dismissal of, and otherwise directly defend against, the FTC’s enforcement action.”

    With respect to the FTC’s action, the court granted the data broker’s motion to dismiss the FTC’s complaint, but gave the agency leave to amend. The court agreed with the data broker that the FTC’s complaint lacks sufficient allegations to support its unfairness claim under Section 5 of the FTC Act. While the court disagreed with the data broker’s assertions that it did not have “fair notice that its sale of geolocation data without restrictions near sensitive locations could violate Section 5(a) of the FTC Act” and that the FTC had to allege a predicate violation of law or policy to state a claim, the court determined that the FTC failed to adequately allege that the data broker’s practices created “a ‘significant risk’ of concrete harm.” Moreover, the court found that “the purported privacy intrusion is not severe enough to constitute ‘substantial injury’ under Section 5(n).” The court noted, however, that some of the deficiencies may be cured through additional factual allegations in an amended complaint.

    Federal Issues Courts Privacy, Cyber Risk & Data Security FTC Enforcement Data Brokers FTC Act UDAP Unfair

  • EU court says non-material damages in unlawful data processing may be eligible for compensation

    Privacy, Cyber Risk & Data Security

    On May 4, the Court of Justice of the European Union (CJEU) issued a judgment concluding that while not every infringement of the EU’s data protection law gives rise, by itself, to a right to compensation, non-material damage resulting from unlawful processing of data can be eligible for compensation. The CJEU reviewed questions posed by the Austrian Supreme Court on whether a mere infringement of the GDPR is sufficient to confer the right to compensation for individuals suffering non-material damages, and whether such compensation is possible only if the non-material damage suffered reaches a certain degree of seriousness. The Austrian Supreme Court also asked the CJEU to clarify what the EU-law requirements are when determining the amount of damages.

    The CJEU clarified that the General Data Protection Regulation (GDPR) does not set thresholds for the “seriousness” of damages needed to confer a right to compensation. “[I]t is clear that the right to compensation provided for by the GDPR is subject to three cumulative conditions: infringement of the GDPR, material or non-material damage resulting from that infringement and a causal link between the damage and the infringement,” the court said in the announcement. Limiting the right to compensation to non-material damage that reaches a certain threshold would be contrary to the broad conception of “damage” outlined in EU law, the CJEU explained, pointing out that conditioning compensation on such a threshold would produce different outcomes depending on a court’s assessment. Moreover, the CJEU emphasized that because the GDPR does not contain any rules governing the assessment of damages, it is up to each member state’s legal system to prescribe detailed rules for actions intended to safeguard individuals’ rights under the GDPR, as well as the criteria for determining the amount of compensation, provided the determination complies with the principles of equivalence and effectiveness. The CJEU explained in its ruling that “an infringement of the GDPR does not necessarily result in damage, and [] that there must be a causal link between the infringement in question and the damage suffered by the data subject in order to establish a right to compensation.”

    Privacy, Cyber Risk & Data Security Courts Of Interest to Non-US Persons EU GDPR Consumer Protection

  • ID verifier to pay $28.5 million to settle BIPA allegations

    Privacy, Cyber Risk & Data Security

    On May 5, the U.S. District Court for the Northern District of Illinois preliminarily approved an amended class action settlement in which an identification verification service provider agreed to pay $28.5 million to settle allegations that it violated the Illinois Biometric Information Privacy Act (BIPA). According to the plaintiffs, the defendant collected, stored, and/or used class members’ biometric data without authorization when they uploaded photos and state IDs on a mobile app belonging to one of the defendant’s customers. After the court denied the defendant’s motion to compel arbitration and determined the plaintiff had standing to pursue his BIPA claims, the parties entered into settlement discussions without the defendant admitting any allegations or liability. The court certified two classes: (i) Illinois residents who uploaded photos to the defendant through the app or website of a financial institution (class members will receive $15.7 million); and (ii) Illinois residents who uploaded photos through a non-financial institution (class members will receive $12.8 million). A final approval hearing will determine attorney’s fees and expenses and incentive awards.

    Privacy, Cyber Risk & Data Security Courts State Issues Illinois Class Action Settlement Consumer Protection BIPA

  • Indiana becomes seventh state to enact comprehensive privacy legislation

    Privacy, Cyber Risk & Data Security

    On May 1, the Indiana governor signed SB 5 to establish a framework for controlling and processing consumers’ personal data in the state. Indiana is now the seventh state in the nation to enact comprehensive consumer privacy measures, following California, Colorado, Connecticut, Virginia, Utah, and Iowa (covered by Special Alerts here and here and InfoBytes here, here, here, and here). The Act applies to any person that conducts business in the state or produces products or services targeted to residents and, during a calendar year, (i) controls or processes personal data of at least 100,000 Indiana residents or (ii) controls or processes personal data of at least 25,000 Indiana residents and derives more than 50 percent of gross revenue from the sale of personal data. The Act outlines exemptions, including financial institutions and data subject to the Gramm-Leach-Bliley Act, as well as covered entities governed by the Health Insurance Portability and Accountability Act.

    Indiana consumers will have the right to, among other things, (i) confirm whether their personal data is being processed and access their data; (ii) correct inaccuracies; (iii) delete their data; (iv) obtain a copy of personal data processed by a controller; and (v) opt out of the processing of their data for targeted advertising, the sale of their data, or certain profiling. The Act outlines data controller responsibilities, including a requirement that controllers must respond to consumers’ requests within 45 days unless extenuating circumstances arise. The Act also limits the collection of personal data “to what is adequate, relevant, and reasonably necessary in relation to the purposes for which such data is processed, as disclosed to the consumer,” and requires controllers to implement data security protection practices “appropriate to the volume and nature of the personal data at issue” and conduct data protection assessments for processing activities created or generated after December 31, 2025, that present a heightened risk of harm to consumers. Under the Act, controllers may not process consumers’ sensitive personal data without first obtaining consent, or in the case of a minor, without processing such data in accordance with the Children’s Online Privacy Protection Act. Additionally, the Act sets forth obligations relating to contracts between a controller and a processor.

    While the Act explicitly prohibits its use as a basis for a private right of action, it does grant the state attorney general exclusive authority to enforce the law. Additionally, upon discovering a potential violation of the Act, the attorney general must give the controller or processor written notice and 30 days to cure the alleged violation before the attorney general can file suit. The attorney general may seek injunctive relief and civil penalties not to exceed $7,500 for each violation.

    The Act takes effect January 1, 2026.

    Privacy, Cyber Risk & Data Security State Issues State Legislation Indiana Consumer Protection COPPA
