
InfoBytes Blog

Financial Services Law Insights and Observations


  • CSBS announces Nonbank Model Data Security Law

    Privacy, Cyber Risk & Data Security

    The Conference of State Bank Supervisors (CSBS) recently released a comprehensive framework for safeguarding sensitive information held at nonbank financial institutions. CSBS’s Nonbank Model Data Security Law is largely based on the FTC’s updated Safeguards Rule, which added specific criteria that financial institutions and other entities, such as mortgage brokers, motor vehicle dealers, and payday lenders, must follow when conducting risk assessments and implementing information security programs. (Covered by InfoBytes here.) Adopting the Nonbank Model Data Security Law allows for a streamlined and efficient approach to data security regulation for nonbank financial institutions, CSBS explained, adding that, by leveraging the existing Safeguards Rule’s applicability to state-covered nonbanks, the model law imposes minimal additional compliance burdens and ensures smoother implementation for financial institutions. States can also take an alternative approach by requiring nonbank financial institutions to conform to the Safeguards Rule itself, CSBS said.

    The Nonbank Model Data Security Law outlines numerous provisions, which are intended to protect customer information, mitigate cyber threats, and foster a secure financial ecosystem. These include standards for safeguarding customer information, required elements that must be included in a nonbank financial institution’s information security program, and an optional section that requires entities to notify the commissioner in the wake of a security event. CSBS noted that because “the proposed rule on notification requirements for the FTC Safeguards Rule is still pending, the model law allows each state to establish their own customer threshold number, providing flexibility in determining the extent of impact that triggers the notification obligation.” CSBS also provided a list of resources for adopting the Nonbank Model Data Security Law.

    Privacy, Cyber Risk & Data Security State Issues CSBS Nonbank FTC Safeguard Rule Compliance

  • SEC adopts breach-reporting rules, establishes requirements for cybersecurity risk management

    Agency Rule-Making & Guidance

    On July 26, a divided SEC adopted a final rule outlining disclosure requirements for publicly traded companies in the event of a material cybersecurity incident. The final rule (proposed last year and covered by InfoBytes here) also requires companies to periodically disclose their cybersecurity risk management processes and establishes requirements for how cybersecurity disclosures must be presented. The final rule requires that material cybersecurity incidents be disclosed within four business days from the time a company determines the incident was material (a disclosure may be delayed should the U.S. attorney general notify the SEC in writing that immediate disclosure poses a substantial risk to national security or public safety). Companies must also identify material aspects of the incident’s nature, scope, and timing, as well as its impact or reasonably likely impact on the company, and are required to describe their board’s and management’s oversight of risks from cybersecurity threats and previous cybersecurity incidents. These risk management and governance disclosures will be required in a company’s annual report. The final rule also requires foreign private issuers to provide comparable disclosures on forms related to material cybersecurity incidents and risk management, strategy, and governance.

    The final rule is effective 30 days following publication of the adopting release in the Federal Register. The SEC noted that incident-specific disclosures will be required in Forms 8-K and 6-K beginning either 90 days after the final rule’s publication in the Federal Register or on December 18, whichever is later, though smaller reporting companies are provided an extra 180 days before they must begin providing such disclosures. Annual disclosures on cyber risk management, strategy, and governance will be required in Form 10-K and Form 20-F reports starting with annual reports for fiscal years ending on or after December 15. In terms of structured data requirements, all companies must tag disclosures in the required format beginning one year after initial compliance with the related disclosure requirement.

    SEC Chair Gary Gensler commented that, in response to public comments received on the proposed rule, the final rule “streamlines required disclosures for both periodic and incident reporting” and requires companies “to disclose only an incident’s material impacts, nature, scope, and timing, whereas the proposal would have required additional details, not explicitly limited by materiality.”

    In voting against the final rule, Commissioner Hester M. Peirce raised concerns that the final rule’s compliance timelines are overly aggressive even for large companies and that the short incident disclosure period could mislead otherwise uninformed investors and “lead to disclosures that are ‘tentative and unclear, resulting in false positives and mispricing in the market.’” The final rule allows a company to update its incident disclosure in subsequent reports with new information that was unavailable at the time of the initial filing, which could harm investors who suffer a loss due to the mispricing of the company’s securities following the initial report, Peirce said. She also criticized the national security or public safety exemption as being overly narrow. Commissioner Mark Uyeda also opposed the adoption, writing that “[n]o other Form 8-K event requires such broad forward-looking disclosure that needs to be constantly assessed for a potential amendment.” Uyeda also questioned whether “[p]remature public disclosure of a cybersecurity incident at one company could result in uncertainty of vulnerabilities at other companies, especially if it involves a commonly used technology provider, [thus] resulting in widespread panic in the market and financial contagion.”

    Agency Rule-Making & Guidance Federal Issues Securities Privacy, Cyber Risk & Data Security SEC Data Breach Risk Management

  • FTC, HHS say tracking technology may impermissibly disclose personal health data

    Privacy, Cyber Risk & Data Security

    On July 20, the FTC and the U.S. Department of Health and Human Services’ Office for Civil Rights issued a joint letter cautioning hospitals and telehealth providers about the risk that online tracking technologies used within their systems may impermissibly disclose consumers’ personal data to third parties. Samuel Levine, Director of the FTC’s Bureau of Consumer Protection, said “when consumers visit a hospital’s website or seek telehealth services, they should not have to worry that their most private and sensitive health information may be disclosed to advertisers and other unnamed, hidden third parties.” According to the letter, recent research has highlighted concerns about the use of technology to track users’ online activities and sensitive data, including health conditions, diagnoses, medications, medical treatments, frequency of visits to health care professionals, and where an individual seeks medical treatment. The FTC warned that impermissible disclosures of personal data can result in identity theft, financial loss, discrimination, and more. The letter included a reminder that, under the FTC Act and the FTC Health Breach Notification Rule, hospitals and telehealth providers remain obligated to protect against impermissible disclosures of personal health information even if they are not covered by HIPAA.

    Privacy, Cyber Risk & Data Security Federal Issues FTC FTC Act Consumer Protection Health Breach Notification Rule Department of Health and Human Services

  • E-commerce company fined $25 million for alleged COPPA violations

    Federal Issues

    On July 19, the DOJ and FTC announced that a global e-commerce tech company has agreed to pay a penalty for alleged privacy violations related to its smart voice assistant’s data collection and retention practices. The agencies sued the company at the end of May for violating the Children’s Online Privacy Protection Act Rule and the FTC Act, alleging it repeatedly assured users that they could delete collected voice recordings and geolocation information but actually held onto some of this information for years to improve its voice assistant’s algorithm, thus putting the data at risk of harm from unnecessary access. (Covered by InfoBytes here.)

    The stipulated order requires the company to pay a $25 million civil money penalty. The order also imposes injunctive relief requiring the company to (i) identify and delete any inactive smart voice assistant children’s accounts unless a parent requests that they be retained; (ii) notify parents whose children have accounts about updates made to its data retention and deletion practices and controls; (iii) cease making misrepresentations about its “retention, access to or deletion of geolocation information or voice information, including children’s voice information” and delete this information upon request of the user or parent; and (iv) disclose its geolocation and voice information retention and deletion practices to consumers. The company must also implement a comprehensive privacy program specific to its use of users’ geolocation information.

    Federal Issues Privacy, Cyber Risk & Data Security DOJ FTC Enforcement COPPA FTC Act Consumer Protection

  • FTC proposal would allow facial recognition for consent under COPPA

    Agency Rule-Making & Guidance

    On July 19, the FTC announced it is seeking public feedback on whether it should approve an application that proposes to create a new method for obtaining parental consent under the Children’s Online Privacy Protection Act (COPPA). The new method would involve analyzing a user’s facial geometry to confirm the individual’s age. Under COPPA, online sites and services directed to children under 13 are required to obtain parental consent before collecting or using a child’s personal information. COPPA provides a number of acceptable methods for obtaining parental consent but also allows interested parties to submit proposals for new verifiable parental consent methods to the FTC for approval.

    The application was submitted by a company that runs a COPPA safe harbor program, along with a digital identity company and a technology firm that helps companies comply with parental verification requirements. Specifically, the FTC’s request for public comment solicits feedback on several questions relating to the application, including: (i) whether the proposed age verification method is covered by existing methods; (ii) whether the proposed method meets COPPA’s requirements for parental consent (i.e., can the proposed method ensure that the person providing consent is the child’s parent); (iii) whether the proposed method introduces a privacy risk to consumers’ personal information, including their biometric information; and (iv) whether the proposed method “pose[s] a risk of disproportionate error rates or other outcomes for particular demographic groups.” Comments are due 30 days after publication in the Federal Register.

    Agency Rule-Making & Guidance Federal Issues Privacy, Cyber Risk & Data Security Consumer Protection FTC COPPA

  • European Data Protection Board clarifies GDPR transfers

    Privacy, Cyber Risk & Data Security

    On July 18, the European Data Protection Board (EDPB) published an information note to provide clarity on data transfers under the GDPR to the United States following the European Commission’s adoption of the adequacy decision as part of the EU-U.S. Data Privacy Framework on July 10. The information note also addresses available redress mechanisms under the framework, as well as a new redress mechanism relating to the area of national security. As previously covered by InfoBytes, the European Commission concluded that the U.S. “ensures an adequate level of protection – comparable to that of the European Union – for personal data transferred from the EU to U.S. companies under the new framework.” With the adoption of the new adequacy decision, personal data can now be transferred securely from the EU to U.S. companies participating in the framework without having to implement additional data protection safeguards.

    The information note clarified that transfers based on adequacy decisions do not require supplementary measures. However, transfers to the U.S. not included in the “Data Privacy Framework List” will require appropriate safeguards, such as standard data protection clauses or binding corporate rules. The EDPB emphasized that U.S. government safeguards put in place in the area of national security (including the redress mechanism) will “apply to all data transfers to the [U.S.], regardless of the transfer tool used.” Additionally, EU individuals whose data is transferred to the U.S. based on the adequacy decision may use several redress mechanisms, including submitting complaints with the relevant U.S. organization, while EU organizations may seek advice from their national data protection authority to oversee related processing activities. Moreover, regardless of the transfer method used for sending personal data to the U.S., EU data subjects can submit complaints to their national data protection authority to utilize the new redress mechanism concerning national security. The national data protection authority, in turn, will ensure that the complaint is sent to the EDPB, which will transmit the complaint to the appropriate U.S. authorities.

    The EDPB noted that the European Commission will conduct a review of the adequacy decision one year after it enters into force to ensure all elements have been fully implemented and are effective. Depending on the findings, the European Commission will decide, in consultation with the EDPB and the EU member states, whether subsequent reviews are warranted.

    Privacy, Cyber Risk & Data Security Of Interest to Non-US Persons EU European Data Protection Board GDPR EU-US Data Privacy Framework

  • FTC fines company $7.8 million over health data and third-party advertisers

    Federal Issues

    On July 14, the FTC finalized an order against an online counseling service, requiring it to pay $7.8 million and prohibiting the sharing of consumers’ health data for advertising purposes. The FTC alleged that the respondent shared consumers’ sensitive health data with third parties despite promising to keep such information private (covered by InfoBytes here). The FTC said it will use the settlement funds to provide partial refunds to affected consumers. The order not only bans the respondent from disclosing health data for advertising and marketing purposes but also prohibits the sharing of consumers’ personal information for re-targeting. The order also stipulates that the respondent must now obtain consumers’ affirmative express consent before disclosing personal information, implement a comprehensive privacy program with certain data protection measures, instruct third parties to delete shared data, and adhere to a data retention schedule.

    Federal Issues Privacy, Cyber Risk & Data Security FTC Enforcement Consumer Protection Telehealth FTC Act Deceptive Advertisement Third-Party

  • Illinois Supreme Court declines to reconsider BIPA accrual ruling

    Privacy, Cyber Risk & Data Security

    On July 18, the Illinois Supreme Court declined to reconsider its February ruling, which held that under the state’s Biometric Information Privacy Act (BIPA or the Act), claims accrue “with every scan or transmission of biometric identifiers or biometric information without prior informed consent.” Three justices, however, dissented from the denial of rehearing, writing that the ruling leaves “a staggering degree of uncertainty” by offering courts and defendants little guidance on how to determine damages. The putative class action stemmed from allegations that the defendant fast food chain violated BIPA sections 15(b) and (d) by unlawfully collecting plaintiff’s biometric data and disclosing the data to a third-party vendor without first obtaining her consent. While the defendant challenged the timeliness of the action, the plaintiff asserted that “a new claim accrued each time she scanned her fingerprints” and her data was sent to a third-party authenticator, thus “rendering her action timely with respect to the unlawful scans and transmissions that occurred within the applicable limitations period.”

    In February, a split Illinois Supreme Court held that claims accrue under BIPA each time biometric identifiers or biometric information (such as fingerprints) are scanned or transmitted, rather than simply the first time. (Covered by InfoBytes here.) The dissenting judges wrote that they would have granted rehearing because the majority’s determination that BIPA claims accrue with every transmission “subvert[s] the intent of the Illinois General Assembly, threatens the survival of businesses in Illinois, and consequently raises significant constitutional due process concerns.” The dissenting judges further maintained that the majority’s February decision is confusing and lacks guidance for courts when determining damages awards. While the majority emphasized that BIPA does not contain language “suggesting legislative intent to authorize a damages award that would result in the financial destruction of a business,” it also said that it continues “to believe that policy-based concerns about potentially excessive damage awards under [BIPA] are best addressed by the legislature,” and that it “respectfully suggest[s] that the legislature review these policy concerns and make clear its intent regarding the assessment of damages under [BIPA].”


    Privacy, Cyber Risk & Data Security Courts State Issues Illinois BIPA Enforcement Consumer Protection Class Action

  • Oregon is 11th state to enact comprehensive privacy legislation

    Privacy, Cyber Risk & Data Security

    On July 18, the Oregon governor signed SB 619 (the Act) to establish a framework for controlling and processing consumer personal data in the state. Oregon follows California, Colorado, Connecticut, Virginia, Utah, Iowa, Indiana, Tennessee, Montana, and Texas in enacting comprehensive consumer privacy measures. Last month, Florida also enacted privacy legislation, but the requirements focus on specific digital controllers with global gross annual revenues of more than $1 billion.

    Highlights of the Act include:

    • Applicability. The Act applies to persons conducting business or producing products or services intentionally directed at Oregon residents that either control or process personal data of more than 100,000 consumers per calendar year (“other than personal data controlled or processed solely for the purpose of completing a payment transaction”) or earn 25 percent or more of their gross revenue from the sale of personal data and process or control the personal data of 25,000 consumers or more. Additionally, the Act provides several exemptions, including financial institutions and their affiliates, data governed by the Gramm-Leach-Bliley Act and certain other federal laws, nonprofit organizations, and protected health information processed by a covered entity in compliance with the Health Insurance Portability and Accountability Act, among others. The Act does not apply to personal information collected in the context of employment or business-to-business relationships.
    • Consumer rights. Under the Act, consumers will be able to access their personal data, make corrections, request deletion of their data, and obtain a copy of their data in a portable format. Consumers will also be able to opt out of the processing of personal information for targeted advertising, the sale of personal information, or profiling “in furtherance of decisions that produce legal effects or effects of similar significance.” Data controllers also will be required to obtain a consumer’s consent to process sensitive personal information or, in the case of a known child, obtain consent from the child’s parent or lawful guardian. Additionally, the Act requires opt-in consent for using the personal data of a youth 13 to 15 years old for targeted advertising or profiling. The Act makes clear that consent means “an affirmative act by means of which a consumer clearly and conspicuously communicates the consumer’s freely given, specific, informed and unambiguous assent to another person’s act or practice.” This does not include the use of an interface “that has the purpose or substantial effect of obtaining consent by obscuring, subverting or impairing the consumer’s autonomy, decision-making or choice.” Controllers that receive a consent revocation from a consumer must process the revocation within 15 days.
    • Controller responsibilities. Among the Act’s requirements, data controllers will be responsible for (i) responding to consumer requests within 45 days after receiving a request (a 45-day extension may be granted when reasonably necessary upon notice to the consumer); (ii) providing clear and meaningful privacy notices; (iii) disclosing to consumers when their personal data is sold to third parties or processed for targeted advertising, and informing consumers how they may opt out; (iv) limiting the collection of data to what is adequate, relevant, and reasonably necessary for a specified purpose and securing personal data from unauthorized access; (v) conducting and retaining data protection assessments where there is a heightened risk of harm and ensuring deidentified data cannot be associated with a consumer; and (vi) avoiding unlawful discrimination.
    • Data processing agreements. The Act stipulates that processors must follow a controller’s instructions and help meet the controller’s obligations concerning the processing of personal data. The Act also sets forth obligations relating to contracts between a controller and a processor. Processors that engage a subcontractor must ensure the subcontractor meets the processor’s obligations with respect to personal data under the processor’s contract with the controller. 
    • Private right of action and state attorney general enforcement. The Act does not provide a private right of action to consumers. Instead, the Oregon attorney general may investigate violations and seek civil penalties of no more than $7,500 per violation. Before initiating such action, the attorney general may grant the controller 30 days to cure the violation. 

    The Act takes effect July 1, 2024.
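    For illustration only, the Act’s two alternative applicability thresholds described above can be sketched as a simple predicate. This is a hypothetical model, not legal advice: actual coverage turns on the Act’s definitions and exemptions, and the function and parameter names below are not terms from the statute.

```python
def oregon_act_applies(consumers_processed: int,
                       consumers_processed_for_sale: int,
                       share_revenue_from_data_sales: float) -> bool:
    """Rough model of SB 619's two alternative applicability thresholds."""
    # Prong 1: controls or processes personal data of more than 100,000
    # consumers per calendar year (the Act excludes data processed solely
    # to complete a payment transaction, which this sketch ignores).
    prong_one = consumers_processed > 100_000
    # Prong 2: earns 25 percent or more of gross revenue from selling
    # personal data AND processes or controls the personal data of
    # 25,000 consumers or more.
    prong_two = (share_revenue_from_data_sales >= 0.25
                 and consumers_processed_for_sale >= 25_000)
    return prong_one or prong_two

print(oregon_act_applies(150_000, 0, 0.0))      # True via the volume prong
print(oregon_act_applies(30_000, 30_000, 0.30)) # True via the data-sales prong
print(oregon_act_applies(50_000, 10_000, 0.30)) # False: neither prong met
```

    Note that the prongs are alternatives: a business meeting either one falls within the Act’s scope, subject to the exemptions listed above.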

    Privacy, Cyber Risk & Data Security State Issues State Legislation Oregon Consumer Protection

  • Washington releases FAQs for My Health My Data Act

    Privacy, Cyber Risk & Data Security

    On June 20, the Washington attorney general published a series of Frequently Asked Questions (FAQs) related to the My Health My Data Act—a comprehensive health privacy law that provides broad restrictions on the use of consumer health data (covered by InfoBytes here). The FAQs include information on the law’s effective dates and applicability. According to the AG, “all persons, as defined in the Act, must comply with section 10 beginning July 23, 2023. Regulated entities that are not small businesses must comply with sections 4 through 9 beginning March 31, 2024. Small businesses, as defined in the Act, must comply with sections 4 through 9 beginning June 30, 2024. For sections 4 through 9, the effective dates apply to the entirety of the section and are not limited to the subsections in which the effective dates appear.” Additionally, the FAQs clarify that a business that is covered by the Act must provide a link to its consumer health data privacy policy on its homepage.
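    The staggered compliance dates stated in the FAQs can be captured as a small lookup table. This is an illustrative sketch only; the category keys and function name are hypothetical labels, not terms from the Act.

```python
from datetime import date

# Compliance dates per the Washington AG's FAQs (keys are illustrative).
MHMDA_COMPLIANCE_DATES = {
    # Section 10 applies to all persons, as defined in the Act.
    ("any_person", "section_10"): date(2023, 7, 23),
    # Sections 4-9 for regulated entities that are not small businesses.
    ("regulated_entity", "sections_4_9"): date(2024, 3, 31),
    # Sections 4-9 for small businesses, as defined in the Act.
    ("small_business", "sections_4_9"): date(2024, 6, 30),
}

def compliance_date(entity_type: str, sections: str) -> date:
    """Return the date by which the given entity type must comply."""
    return MHMDA_COMPLIANCE_DATES[(entity_type, sections)]

print(compliance_date("small_business", "sections_4_9"))  # 2024-06-30
```

    As the FAQs note, each date applies to the entirety of the listed sections, not only to the subsections in which the date appears.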

    The FAQs also address a potential conflict between Sections 6 and 9 of the Act, which govern the right to delete and consumers’ authorizations to sell data, respectively. Section 9 mandates that any person, not just regulated entities, must obtain consumer authorization before selling or offering to sell their data. Both the seller and purchaser are required to retain a copy of the authorization, which may contain consumer health data, for six years. However, Section 6 stipulates that consumer health data must be deleted from a regulated entity’s network upon the consumer’s request. The FAQs advise that in cases where a consumer requests deletion under Section 6, any authorizations stored under Section 9 must be redacted to eliminate any information related to the data that was sold.

    Privacy, Cyber Risk & Data Security State Issues Washington Consumer Protection Medical Data State Attorney General
