InfoBytes Blog

Financial Services Law Insights and Observations


  • FTC orders mental health service company to pay for privacy and data violations

    Federal Issues

    On April 15, the FTC released its administrative complaint and joint stipulated order against a mental health service provider, requiring the provider to pay a total of more than $7 million, including $5.1 million in consumer refunds and $2 million in civil penalties. According to the complaint, the defendant collected sensitive personal health information and sold online mental healthcare treatments (i.e., telehealth) through its website to “hundreds of thousands” of patients between 2021 and 2022. The FTC alleged the mental health service provider engaged in deceptive and unfair practices in marketing its data security practices, such as failing to disclose material facts, failing to obtain consumers’ express informed consent, and failing to implement adequate data security measures. In addition, the FTC alleged that the provider misled consumers about its cancellation of services, including by failing to provide a mechanism to stop recurring charges. The FTC’s complaint specifically alleged that the company misrepresented how it would use and disclose patients’ personal information, mishandled and exposed the personal information of “hundreds of thousands” of consumers, and failed to provide a means to cancel subscriptions. The FTC charged the defendant with violating Section 5 of the FTC Act (covering deceptive privacy practices, deceptive data security practices, unfair privacy and data security practices, and deceptive cancellation practices), as well as with violating the Opioid Act and the Restore Online Shoppers’ Confidence Act (ROSCA).

    In the joint stipulated order, although the defendant neither admitted nor denied these allegations, the order prohibited the defendant from disclosing any covered information to any third party for advertising purposes, disclosing any covered information to an outside party without obtaining a consumer’s affirmative express consent, and misrepresenting its cancellation policies. The order also required the defendant to implement stronger protections for individuals’ private information and to undergo regular assessments of its data security practices. The court ordered the defendant to pay $5,087,252 in monetary relief to consumers and a civil money penalty of $10 million, which the FTC agreed to suspend in exchange for a payment of $2 million based on the defendant’s inability to pay the full civil money penalty.

    Federal Issues FTC Privacy, Cyber Risk & Data Security ROSCA

  • Massachusetts’ attorney general issues AI guidance related to state UDAP law

    Privacy, Cyber Risk & Data Security

    On April 16, the Attorney General for Massachusetts (AG) released an advisory notice on how developers, suppliers, and users of artificial intelligence (AI) should avoid “unfair and deceptive” practices to comply with consumer protection laws. The AG noted how AI systems can pose consumer harms, including through bias, lack of transparency, and data privacy issues, since consumers often lack the ability to avoid or test the “appropriateness” of AI systems forced upon them. Chapter 93A of Massachusetts law, the Massachusetts Consumer Protection Act, protects consumers against “unfair and deceptive” practices, the definition of which has changed over time. In addition to the consumer protection law, the AG highlighted several other state and federal consumer protections, including the ECOA, to bolster her advisory.

    The AG’s advisory construed Chapter 93A to apply to AI, clarifying that the following practices may qualify as “unfair or deceptive”: (i) a company falsely advertising the quality of its AI systems; (ii) a company supplying a defective or impractical AI system; (iii) a company misrepresenting the reliability or safety of its AI system; (iv) a company putting an AI system up for sale in breach of warranty, meaning that the system was unfit for the purpose for which it was sold; (v) a company using multimedia content to impersonate or deceive (such as using deepfakes, voice cloning, or chatbots to commit fraud); or (vi) a company failing to comply with other Massachusetts statutes.

    Privacy, Cyber Risk & Data Security Massachusetts State Attorney General Artificial Intelligence UDAP CFPB

  • Kentucky enacts a comprehensive data privacy law for controllers

    Privacy, Cyber Risk & Data Security

    On April 4, Kentucky enacted HB 15 (the “Act”), which will apply to persons conducting business in Kentucky, or producing products or services targeted to Kentucky residents, that also handle the personal data of at least (i) 100,000 consumers, or (ii) 25,000 consumers while deriving over 50 percent of gross revenue from the sale of personal data. The Act does not apply to various entities, including: (i) city or state agencies, or political subdivisions of the state; (ii) financial institutions and their affiliates, as well as data subject to the Gramm-Leach-Bliley Act; (iii) covered entities or businesses governed by HIPAA regulations; and (iv) nonprofit organizations. The Act will be enforced by Kentucky’s Attorney General.
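
    For illustration only, the Act’s cumulative threshold test can be sketched in code. This is a minimal sketch of the applicability check as summarized above; the function name and inputs are hypothetical, not drawn from the statute.

    # Illustrative sketch of Kentucky HB 15's applicability thresholds as
    # summarized above; names and structure are hypothetical, not statutory text.
    def kentucky_act_applies(
        targets_kentucky_residents: bool,
        consumers_with_data_processed: int,
        revenue_from_data_sales: float,
        gross_revenue: float,
    ) -> bool:
        """Return True if a business appears to meet the Act's thresholds."""
        if not targets_kentucky_residents:
            return False
        # Threshold (i): personal data of 100,000 or more consumers.
        if consumers_with_data_processed >= 100_000:
            return True
        # Threshold (ii): 25,000 or more consumers AND more than 50 percent
        # of gross revenue derived from the sale of personal data.
        return (
            consumers_with_data_processed >= 25_000
            and gross_revenue > 0
            and revenue_from_data_sales / gross_revenue > 0.5
        )

    # Example: 30,000 consumers and 60 percent of revenue from data sales -> covered.
    print(kentucky_act_applies(True, 30_000, 6_000_000, 10_000_000))  # True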

    The Act will impose several requirements on controllers, including: (i) limiting collection of personal data to what is relevant and necessary for the disclosed purposes; (ii) implementing reasonable administrative, technical, and physical data security measures to safeguard the confidentiality, integrity, and accessibility of personal data; (iii) refraining from processing personal data for undisclosed purposes unless the consumer consents; and (iv) obtaining explicit consent before processing sensitive data and, in the case of sensitive data concerning a known child, processing that data in accordance with the Children’s Online Privacy Protection Act (COPPA). Controllers will also need to conduct and document a data protection impact assessment for certain activities, such as targeted advertising, selling personal data, and profiling. Furthermore, controllers will be required to furnish consumers with a privacy notice containing information on the categories and purposes of data processing, consumer rights, appeals processes, and disclosures to third parties.

    The Act will grant consumers the right to confirm whether a controller is processing their personal data and to access that data, except where doing so would expose trade secrets. Consumers will also have the right to correct inaccuracies, to have their personal data deleted, and to receive a copy of the personal data processed by the controller in a portable and readily usable format, allowing transmission to another controller without impediment where the processing is carried out by automated means. Further, consumers will have the right to opt out of processing for targeted advertising, the sale of personal data, or profiling in furtherance of solely automated decisions with significant legal effects. Controllers must respond to consumer rights requests within 45 days and may extend that period by an additional 45 days when reasonably necessary. Controllers and processors will be given a 30-day cure period, during which they must confirm in writing that alleged violations have been rectified and pledge to prevent future breaches. The Act will go into effect January 1, 2026.

    Privacy, Cyber Risk & Data Security State Issues Kentucky Consumer Protection Gramm-Leach-Bliley

  • CFPB Director speaks on new and proposed rules for data brokers

    Agency Rule-Making & Guidance

    On April 2, the Director of the CFPB, Rohit Chopra, delivered a speech at the White House Office of Science and Technology Policy highlighting President Biden’s recent Executive Order (EO) to Protect Americans’ Sensitive Personal Data and how the CFPB plans to develop rules to regulate “data brokers” under the FCRA. As previously covered by InfoBytes, the EO ordered several agencies, including the CFPB, to better protect Americans’ data. Chopra highlighted how the EO not only covered data breaches but also regulated “data brokers” that ingest and sell data. According to the EO, “Commercial data brokers… can sell [data] to countries of concern, or entities controlled by those countries, and it can land in the hands of foreign intelligence services, militaries, or companies controlled by foreign governments.”

    Consistent with the EO, the CFPB plans to propose rules this year to regulate “data brokers” under its FCRA authority. Specifically, the proposed rules would bring data brokers within the definition of “consumer reporting agency”; further, a company’s sale of consumer payment or income data would be considered a “consumer report” subject to FCRA requirements, such as accuracy standards, consumer dispute rights, and other provisions prohibiting misuse of the data.

    Agency Rule-Making & Guidance Federal Issues CFPB Privacy, Cyber Risk & Data Security Executive Order Data Brokers

  • California regulator advises businesses to only collect needed data under CCPA

    Privacy, Cyber Risk & Data Security

    On April 2, the California Privacy Protection Agency issued Enforcement Advisory No. 2024-01, reminding businesses that data minimization is a foundational principle of the California Consumer Privacy Act (CCPA). The Advisory noted that the Agency has observed certain businesses collecting unnecessary and disproportionate amounts of personal information and emphasized that minimization principles apply to the processing of consumer requests. The Advisory highlighted the requirements of minimization, including that the collection, use, sharing, and retention of personal information must be reasonable and proportionate to the purposes identified, considering the minimum personal information required, the potential negative impacts on consumers, and the existence of additional safeguards addressing those negative impacts. The Advisory also discussed two scenarios: one describing an opt-out procedure, and the other describing verification in connection with a consumer request. For the opt-out procedure, the Advisory reminded businesses that they may not require a consumer to verify their identity in order to process an opt-out request (a business may, however, ask the consumer for the information necessary to complete the request). For the verification procedure, the Advisory outlined a possible process for analyzing whether additional verification information would be required, such as whether the business stores driver’s license information.

    Privacy, Cyber Risk & Data Security California CCPA CPPA Digital Identity Identity Theft

  • New Hampshire enacts SB 255, a comprehensive consumer privacy bill

    State Issues

    Recently, the Governor of New Hampshire signed SB 255 (the “Act”), making New Hampshire the 14th state to enact a comprehensive consumer privacy bill. The Act will apply to entities that engage in commercial activities within New Hampshire or target New Hampshire consumers with their products or services and that, during a one-year period, either: (i) control or process data of 35,000 New Hampshire consumers (excluding data processed solely for purposes of completing a payment transaction); or (ii) control or process data of 10,000 New Hampshire consumers and derive more than 25 percent of their revenue from selling the data. Exemptions include entities or data subject to Title V of the Gramm-Leach-Bliley Act, non-profit organizations, and higher education institutions. The legislation will also exempt specific types of data, such as health information protected under HIPAA and data subject to the FCRA. The definition of consumer is limited to an individual residing in New Hampshire and excludes both employee and business-to-business (B2B) data.
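
    As with the Kentucky sketch above, the Act’s two-pronged applicability test can be expressed as a short illustrative check; the names and inputs below are hypothetical, not statutory terms.

    # Hypothetical sketch of SB 255's applicability thresholds as summarized
    # above; the payment-transaction carve-out attaches to prong (i) only.
    def nh_act_applies(
        consumers_processed: int,        # NH consumers whose data is controlled/processed
        payment_only_consumers: int,     # of those, processed solely to complete payments
        data_sale_revenue_share: float,  # fraction of revenue from selling personal data
    ) -> bool:
        # Prong (i): 35,000 consumers, not counting payment-only processing.
        if consumers_processed - payment_only_consumers >= 35_000:
            return True
        # Prong (ii): 10,000 consumers plus more than 25 percent of revenue
        # derived from selling the data.
        return consumers_processed >= 10_000 and data_sale_revenue_share > 0.25

    # Example: 12,000 consumers with 30 percent of revenue from data sales -> covered.
    print(nh_act_applies(12_000, 0, 0.30))  # True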

    The Act will define new terms, such as “sensitive data,” meaning “personal data that includes data revealing racial or ethnic origin, religious beliefs, mental or physical health condition or diagnosis, sex life, sexual orientation or citizenship or immigration status.” “Sensitive data” also includes genetic or biometric information, data concerning children, and precise location details. New Hampshire will now mandate that companies obtain explicit consent from consumers before processing sensitive data.

    The Act will grant consumers the following rights: the right to know, the right to correct, the right to delete, the right to opt out of the processing of their personal data for targeted advertising, sales, or profiling in furtherance of solely automated decisions that produce legal or similarly significant effects, and the right to data portability. Consumers will also be protected against discrimination for exercising any of the above rights.

    The Act will also impose controller responsibilities, including:

    • Limiting the collection of personal data to what is adequate, relevant, and reasonably necessary;
    • Not processing personal data for purposes that are neither reasonably necessary to, nor compatible with, the purposes disclosed to the consumer, unless the controller obtains the consumer's consent;
    • Establishing, implementing, and maintaining reasonable administrative, technical, and physical data security practices to protect the confidentiality, integrity, and accessibility of personal data;
    • Not processing sensitive data concerning a consumer without obtaining the consumer's consent or, in the case of sensitive data concerning a known child, without processing such data in accordance with COPPA;
    • Providing an effective mechanism for a consumer to revoke consent that is at least as easy as the mechanism by which the consent was provided and, upon revocation, ceasing to process the data as soon as practicable, but not later than 15 days after receipt of the request; and
    • Not processing the personal data of a consumer for purposes of targeted advertising, or selling the consumer's personal data, without the consumer's consent where the controller has actual knowledge, and willfully disregards, that the consumer is at least 13 but younger than 16 years of age.

    The controller also must provide a privacy notice meeting the standards set forth by the Secretary of State. Controllers must conduct data protection assessments for each processing activity that presents a heightened risk of harm to a consumer, including: (i) the processing of personal data for the purpose of targeted advertising; (ii) the sale of personal data; (iii) the processing of sensitive data; and (iv) the processing of personal data for profiling, where profiling presents a reasonably foreseeable risk of unfair or deceptive treatment of consumers, unlawful disparate impact, or undue intrusion upon solitude or seclusion.

    The attorney general has exclusive authority to enforce the Act. Between January 1, 2025, and December 31, 2025, the attorney general is required to provide notice of an alleged violation and an accompanying 60-day cure period before commencing an enforcement action. Beginning January 1, 2026, the attorney general has the discretion to provide an opportunity to cure but is not required to provide such an opportunity. The Act does not include a private right of action. The Act will take effect on January 1, 2025.
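
    The enforcement timeline lends itself to a small illustration. Below is a minimal sketch, using hypothetical names, of when a cure period is mandatory, discretionary, or not yet applicable under the dates described above.

    # Hypothetical sketch of SB 255's cure-period timeline as described above.
    from datetime import date

    EFFECTIVE = date(2025, 1, 1)          # the Act takes effect
    DISCRETION_START = date(2026, 1, 1)   # cure becomes discretionary

    def cure_period_status(violation_date: date) -> str:
        """Classify the attorney general's cure obligation on a given date."""
        if violation_date < EFFECTIVE:
            return "Act not yet in effect"
        if violation_date < DISCRETION_START:
            return "mandatory 60-day cure period"
        return "cure period at the attorney general's discretion"

    print(cure_period_status(date(2025, 6, 1)))   # mandatory 60-day cure period
    print(cure_period_status(date(2026, 3, 15)))  # cure period at the attorney general's discretion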

    State Issues Privacy, Cyber Risk & Data Security New Hampshire State Legislation Consumer Protection

  • Utah enshrines two acts to create cybersecurity notification guidelines

    Privacy, Cyber Risk & Data Security

    On March 19, Utah enacted SB 98, which amended the state’s online data security and privacy requirements. SB 98 sets out new protocols that individuals and governmental entities must follow under the state’s data breach reporting requirements. SB 98 will require individuals and governmental entities to provide specific information about a breach, including, among other things: (i) when the data breach occurred; (ii) when the data breach was discovered; (iii) the total number of individuals affected by the breach, with a separate count for Utah residents; (iv) the type of personal data involved; and (v) a brief description of the data breach. Governmental entities must additionally report: (vi) the path or means by which access to the system was gained, if known; (vii) the individual or entity that perpetrated the breach, if known; and (viii) the actions taken by the governmental entity to mitigate the effects of the breach. Additionally, the Utah Cyber Center will be tasked with assisting governmental entities in responding to breaches. This assistance may include: (a) conducting or participating in an internal investigation; (b) assisting law enforcement with their investigation if necessary; (c) determining the scope of the data breach; (d) helping the entity restore the integrity of the compromised system; and (e) providing any other necessary support in response to the breach.
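
    The required report contents map naturally onto a record type. Here is a minimal sketch, with hypothetical field names, of items (i) through (viii) listed above.

    # Hypothetical record capturing the breach-report fields summarized above;
    # field names are illustrative, not taken from the statute.
    from dataclasses import dataclass, field
    from datetime import date
    from typing import Optional

    @dataclass
    class BreachReport:
        occurred_on: date                  # (i) when the breach occurred
        discovered_on: date                # (ii) when the breach was discovered
        total_affected: int                # (iii) total individuals affected
        utah_residents_affected: int       # (iii) separate count for Utah residents
        personal_data_types: list[str]     # (iv) types of personal data involved
        description: str                   # (v) brief description of the breach
        # Governmental entities only:
        access_path: Optional[str] = None  # (vi) path or means of access, if known
        perpetrator: Optional[str] = None  # (vii) who perpetrated the breach, if known
        mitigation_actions: list[str] = field(default_factory=list)  # (viii) mitigation steps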

    On that same day, the governor also signed into law HB 491, which enacted the Government Data Privacy Act. The bill describes the duties of state government agencies related to personal data privacy, including breach notification requirements, limits on data collection and use, and consumers’ ability to access and correct personal data. Structurally, the bill created the Utah Privacy Governing Board to recommend changes to state privacy policy, established the Office of Data Privacy to coordinate the implementation of privacy protections, and renamed the Personal Privacy Oversight Commission as the Utah Privacy Commission while amending the commission’s duties. Both SB 98 and HB 491 will go into effect on May 1.

    Privacy, Cyber Risk & Data Security State Issues State Legislation Data Breach Utah

  • Indiana enacts SB 220 on cyber incident notification guidelines

    State Issues

    On March 11, the Governor of Indiana signed SB 220 (the “Act”), which will add cyber incident notification guidelines for financial institutions. The Act defined the term “corporation” to cover the following entities organized in Indiana: (i) banks; (ii) trust companies; (iii) corporate fiduciaries; (iv) savings banks; (v) savings associations; (vi) industrial loan and investment companies with federal deposit insurance; (vii) credit unions; and (viii) banks of discount and deposit.

    According to the Act, a corporation will be required to inform the director of the department about a reportable cyber incident or notification incident following the same protocol mandated by the corporation’s federal regulatory body or deposit insurance provider. If a corporation does not have a federal regulatory body or deposit insurance provider, it must report the cyber incident to the director of the department using the procedures outlined in 12 C.F.R. 748.1(c), which, despite typically applying to federally insured credit unions, will also apply to such corporations. The Act will go into effect on July 1.

    State Issues State Legislation Privacy, Cyber Risk & Data Security Disclosures Indiana

  • EU Parliament becomes first to enact binding law on artificial intelligence

    Privacy, Cyber Risk & Data Security

    On March 13, the European Parliament voted into law the world’s first binding law on artificial intelligence (AI), titled “Laying Down Harmonised Rules on Artificial Intelligence (Artificial Intelligence Act),” to put forth a flexible, coordinated approach to the “human and ethical” implications of AI. The stated objectives of the new regulation include ensuring that AI systems are safe and that they respect existing laws protecting citizens’ fundamental rights. The act also aims to ensure that investment in AI rests on a legally sound footing, that AI rules are enforced effectively, and that a single market for AI applications can develop. The European Parliament cited Article 114 of the Treaty on the Functioning of the European Union as the law’s legal basis.

    Following 89 introductory recitals, Title I of the law sets forth its general subject matter: the law establishes “harmonised rules” for placing AI systems on the market, prohibits certain AI practices, creates specific requirements for “high-risk” AI systems, creates transparency rules for emotion-recognition and biometric systems, and creates rules on market surveillance. Title II addresses prohibited AI practices and Title III covers “high-risk” AI systems. Prohibited practices include AI systems that distort a person’s behavior or exploit their vulnerabilities, that evaluate trustworthiness through a form of social scoring, or that use “real-time” remote biometric identification for law enforcement (unless, among other exceptions, searching for victims of a crime or missing children). Title III defined “high-risk” systems as those posing significant risks to the fundamental rights of individuals, specifically to their health and safety, among others.
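
    The tiered structure described above (prohibited practices, high-risk systems, and transparency-only obligations) can be sketched as a simple taxonomy; the enum and example mapping below are an illustration of that structure, not labels or text from the regulation.

    # Illustrative taxonomy of the AI Act's tiered structure as summarized
    # above; category names and examples are paraphrased, not official labels.
    from enum import Enum

    class RiskTier(Enum):
        PROHIBITED = "banned outright (Title II)"
        HIGH_RISK = "permitted subject to specific requirements (Title III)"
        TRANSPARENCY = "permitted subject to transparency rules"

    EXAMPLES = {
        "social scoring of trustworthiness": RiskTier.PROHIBITED,
        "behavior-distorting or vulnerability-exploiting systems": RiskTier.PROHIBITED,
        "real-time remote biometric ID for law enforcement": RiskTier.PROHIBITED,
        "systems posing significant risks to health, safety, or rights": RiskTier.HIGH_RISK,
        "emotion-recognition and biometric systems": RiskTier.TRANSPARENCY,
    }

    for system, tier in EXAMPLES.items():
        print(f"{system}: {tier.value}")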

    Privacy, Cyber Risk & Data Security Artificial Intelligence European Union

  • New Hampshire enshrines a new consumer privacy law

    Privacy, Cyber Risk & Data Security

    On March 6, the Governor of New Hampshire, Chris Sununu, signed into law a sweeping consumer privacy bill. Under the act, consumers will have the right to confirm whether a controller (a person or entity that determines how personal data is processed) is processing their personal data, the right to access that data, and the rights to correct inaccuracies, obtain a copy of the data, delete the data, and opt out of the processing of the data for targeted advertising purposes. The act also imposed limits on controllers, including that a controller shall (i) limit the collection of data to only what is adequate, relevant, and reasonably necessary for the intended purpose; (ii) establish and maintain administrative security practices to protect the confidentiality of consumer personal data; (iii) not process sensitive data without obtaining the consumer’s consent or, if the data concerns a known child, process the data in accordance with COPPA; (iv) provide an easy means for consumers to revoke consent; and (v) not process personal data for targeted advertising purposes without consumer consent. The bill further outlined a processor’s responsibilities and required controllers to conduct a data protection assessment for each activity that may present a risk of harm to a consumer. The act will go into effect on January 1, 2025.

    Privacy, Cyber Risk & Data Security State Issues New Hampshire State Legislation Opt-Out
