InfoBytes Blog

Financial Services Law Insights and Observations

Subscribe to our InfoBytes Blog weekly newsletter and other publications for news affecting the financial services industry.

  • Agencies highlight downpayment assistance, child privacy in regulatory agendas

    Agency Rule-Making & Guidance

    Recently, the Office of Information and Regulatory Affairs released fall 2022 regulatory agendas for the FTC and HUD. With respect to an FTC review of the Children’s Online Privacy Protection Rule (COPPA) that was commenced in 2019 (covered by InfoBytes here), the Commission stated in its regulatory agenda that it is still reviewing comments. COPPA “prohibits unfair or deceptive acts or practices in connection with the collection, use and/or disclosure of personal information from and about children under the age of 13 on the internet,” and, among other things, “requires operators of commercial websites and online services, with certain exceptions, to obtain verifiable parental consent before collecting, using, or disclosing personal information from or about children.”

    HUD stated in its regulatory agenda that it anticipates issuing a notice of proposed rulemaking in March that would address mortgage downpayment assistance programs. The Housing and Economic Recovery Act of 2008 amended the National Housing Act to add a clause that prohibits any portion of a borrower’s required minimum cash investment from being provided by: “(i) the seller or any other person or entity that financially benefits from the transaction, or (ii) any third party or entity that is reimbursed, directly or indirectly, by any of the parties described in clause (i).” According to the agenda, FHA continues to receive questions about prohibitions on persons or entities that may financially benefit from a mortgage transaction, including “whether down payment assistance programs operated by government entities are being operated in a fashion that would render such assistance prohibited.” A future NPRM would clarify the circumstances in which government entities are deriving a prohibited financial benefit.

    Agency Rule-Making & Guidance Federal Issues FTC HUD COPPA Downpayment Assistance Mortgages Privacy, Cyber Risk & Data Security Consumer Protection FHA

  • Social media users denied preliminary injunction in privacy suit

    Courts

    On December 22, the U.S. District Court for the Northern District of California denied the plaintiffs’ motion for a preliminary injunction in a privacy suit. According to the order, the plaintiffs alleged that the social media company improperly acquired their confidential health information in violation of state and federal law and in contravention of the company’s own policies regarding the use and collection of users’ data. The plaintiffs alleged that each of their healthcare providers installed the company’s software, a free and publicly available piece of code that the company allows third-party website developers to install on their patient portals. When the plaintiffs logged into the portal on their medical provider’s website, the software allegedly transmitted certain information to the social media company. The plaintiffs claimed that the software allowed the company to intercept personally identifiable medical information and the content of patient communications for its financial gain. The court found, however, that though the plaintiffs “raise potentially strong claims on the merits and their alleged injury would be irreparable if proven,” the “plaintiffs need to show ‘that the law and facts clearly favor [their] position, not simply that [they are] likely to succeed.’” The court also noted that the company’s “core defense is that it has systems in place to address the receipt of the information at issue and that it would be unfairly burdensome and technologically infeasible for them to take further action.” The court continued, “[w]ithout further factual development, it is unclear where the truth lies, and plaintiffs do not meet the high standard required for a mandatory injunction.”
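
    As a technical aside, the software at issue appears to be a third-party tracking snippet of the kind many analytics and advertising vendors distribute. The sketch below is a generic, hypothetical illustration of how such a snippet can forward page metadata to a vendor’s servers; the endpoint, names, and fields are assumptions for illustration and are not drawn from the complaint or the company’s actual code. The concern the plaintiffs raise arises when code like this runs on authenticated pages such as patient portals, where the URL or page title alone can reveal health-related information.

    ```typescript
    // Hypothetical sketch of a third-party analytics snippet forwarding page
    // metadata to a vendor endpoint. All names are illustrative only.
    interface PageEvent {
      url: string;       // page the user is viewing (e.g., a patient-portal page)
      title: string;     // document title, which may describe the page's purpose
      referrer: string;  // where the user navigated from
      timestamp: number;
    }

    function sendPageEvent(endpoint: string): void {
      const event: PageEvent = {
        url: window.location.href,
        title: document.title,
        referrer: document.referrer,
        timestamp: Date.now(),
      };
      // navigator.sendBeacon posts the payload even as the page unloads.
      navigator.sendBeacon(endpoint, JSON.stringify(event));
    }

    // The site operator embedding the snippet controls which pages it runs on.
    sendPageEvent("https://collector.example.com/events");
    ```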

    Courts Privacy, Cyber Risk & Data Security Consumer Protection

  • FCC proposes new data breach notification requirements

    Agency Rule-Making & Guidance

    On January 6, the FCC announced a notice of proposed rulemaking (NPRM) to launch a formal proceeding for strengthening the Commission’s rules for notifying customers and federal law enforcement of breaches of customer proprietary network information (CPNI). FCC Chairwoman Jessica Rosenworcel noted that “given the increase in frequency, sophistication, and scale of data leaks, we must update our rules to protect consumers and strengthen reporting requirements.” She commented that the “new proceeding will take a much-needed, fresh look at our data breach reporting rules to better protect consumers, increase security, and reduce the impact of future breaches.” The NPRM, which seeks to improve alignment with recent developments in federal and state data breach laws covering other sectors, would require telecommunications providers to notify impacted customers of CPNI breaches without unreasonable delay, thus eliminating the current seven business day mandatory waiting period for notifying customers of a breach.

    Among other things, the FCC requests feedback on whether to establish a specific timeframe (e.g. a requirement to report breaches of customers’ data within 24 or 72 hours of discovery of a breach) or whether a disclosure deadline should vary based on a graduated scale of severity. The FCC also seeks comments on whether a carrier should “be held to have ‘reasonably determined’ a breach has occurred when it has information indicating that it is more likely than not that there was a breach,” and whether the Commission should publish guidance on what constitutes a reasonable determination or adopt a more definite standard. Feedback is also solicited on topics such as threshold triggers, what should be included in a security breach notification, the delivery method of these notifications, and whether to expand the definition of a data breach to also include inadvertent disclosures. Comments are due 30 days after publication in the Federal Register.

    Agency Rule-Making & Guidance Privacy, Cyber Risk & Data Security FCC Data Breach Consumer Protection

  • France fines software company €60 million for data violations

    Privacy, Cyber Risk & Data Security

    In December, the French data protection agency, Commission Nationale de l’Informatique et des Libertés (CNIL), imposed a €60 million penalty against a global software development company accused of making it harder for users of its search engine to reject cookies than to accept them. Based on investigations conducted in September 2020 and May 2021, CNIL claims that when users visited the search engine, cookies used for advertising purposes and countering advertising fraud, among other things, were automatically deposited on their terminal without the users’ consent. Under French law, these types of cookies may only be deposited after users have expressed their consent, according to CNIL. CNIL further observed that while the search engine offered a button to accept cookies immediately, it did not offer an equivalent button allowing the user to refuse cookies as easily. By making the refusal mechanism more complex, the company discouraged users from refusing cookies and instead encouraged them “to prefer the ease of the consent button in the first window,” CNIL said, adding that “such a procedure infringed the freedom of consent of Internet users.” Claiming violations of Article 82 of the French Data Protection Act, CNIL ordered the company to take measures within three months to modify its practices for obtaining consent from users residing in France. CNIL further stated that additional fines of €60,000 will be imposed per day of non-compliance following the end of the three-month period.
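
    For context, CNIL’s core objection was symmetry: accepting cookies took one click, while refusing them took several. The sketch below is a minimal, hypothetical illustration of a first-layer consent banner that treats acceptance and refusal equally and writes no advertising cookie before a choice is made; the function and cookie names are assumptions, not taken from the decision.

    ```typescript
    // Minimal sketch of a symmetric cookie-consent flow: "Reject all" is as
    // easy as "Accept all", and only the consent decision itself is stored.
    type ConsentChoice = "accepted" | "rejected";

    function recordChoice(choice: ConsentChoice): void {
      // Advertising/analytics cookies would be set elsewhere, and only
      // when choice === "accepted".
      document.cookie = `consent=${choice}; max-age=${60 * 60 * 24 * 180}; path=/`;
    }

    function renderBanner(container: HTMLElement): void {
      const accept = document.createElement("button");
      accept.textContent = "Accept all";
      accept.onclick = () => recordChoice("accepted");

      const reject = document.createElement("button");
      reject.textContent = "Reject all";
      reject.onclick = () => recordChoice("rejected");

      // Both buttons appear in the same first-layer banner with equal
      // prominence, so refusal takes exactly as many clicks as acceptance.
      container.append(accept, reject);
    }
    ```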

    Privacy, Cyber Risk & Data Security Of Interest to Non-US Persons France Enforcement Consumer Protection Cookies

  • Irish DPC fines global social media company €390 million over targeted ads

    Privacy, Cyber Risk & Data Security

    On January 4, the Irish Data Protection Commission (DPC) announced the conclusion of two inquiries into the data processing practices of a global social media company’s European operations. Collectively, the DPC imposed fines totaling €390 million against the company for allegedly requiring users to accept targeted ads when accepting the company’s social media platform terms of service. Complaints were raised in 2018 by data subjects in Austria and Belgium, claiming that the company violated the GDPR by conditioning access to its services on users’ acceptance of the company’s updated terms of service, thereby “forcing” them to consent to the processing of their personal data for behavioral advertising and other personalized services. The company maintained that once a user accepted the updated terms of service, a contract was formed, and that processing user data in connection with the delivery of its social media services was necessary for the performance of that contract (including the provision of personalized services and behavioral advertising). According to the company, “such processing operations were lawful by reference to Article 6(1)(b) of the GDPR (the ‘contract’ legal basis for processing).”

    The DPC issued draft decisions, finding that (i) the company breached its transparency obligations because the “contract” legal basis for processing was not clearly disclosed to users, but that, (ii) in principle, the GDPR did not preclude the company’s reliance on such basis.

    In accordance with the GDPR, the draft decisions were submitted to DPC’s EU peer regulators (Concerned Supervisory Authorities or “CSAs”). Regarding the question of whether the company had acted in contravention of its transparency obligations, the CSAs agreed with the DPC’s decisions but concluded that higher fines should be imposed. Ten of the 47 CSAs, however, concluded that the company “should not be permitted to rely on the contract legal basis on the grounds that the delivery of personalized advertising . . . could not be said to be necessary to perform the core elements of what was said to be a much more limited form of contract.” The DPC disagreed, arguing that personalized advertising is “central to the bargain struck between users and their chosen service provider” as part of the contract that is established when a user accepts the terms of service. The dispute was referred to the European Data Protection Board (EDPB) after the regulators were unable to reach a consensus.

    The EDPB determined that, “as a matter of principle,” the company “is not entitled to rely on the ‘contract’ legal basis as providing a lawful basis for its processing of personal data for the purpose of behavioral advertising.” The DPC adopted the EDPB’s determination and issued final decisions, finding, among other things, that the company’s processing of users’ data in purported reliance on the “contract” legal basis amounts to a contravention of Article 6 of the GDPR. The decisions require the company to bring its processing operations into compliance with the GDPR within a three-month period and impose administrative fines higher than those originally proposed, in line with the EDPB’s direction to increase the fines.

    The company released a statement following the decisions. According to the company, “[t]here has been a lack of regulatory clarity on this issue, and the debate among regulators and policymakers around which legal bases are most appropriate in a given situation has been ongoing for some time. This issue is also currently being debated by the highest courts in the EU, who may yet reach a different conclusion altogether.” The company added that “we strongly disagree with the DPC’s final decision, and believe we fully comply with GDPR by relying on Contractual Necessity for behavioural ads given the nature of our services. As a result, we will appeal the substance of the decision. Given that regulators themselves disagreed with each other on this issue up until the final stage of these processes in December, it is hard to understand how we can be criticised for the approach we have taken to date, and therefore we also plan to challenge the size of the fines imposed.”

    Privacy, Cyber Risk & Data Security Of Interest to Non-US Persons EU GDPR Enforcement

  • Colorado releases second draft of Colorado Privacy Act rules

    Privacy, Cyber Risk & Data Security

    On December 21, the Colorado attorney general released a second set of draft rules for the Colorado Privacy Act (CPA). As previously covered by a Buckley Special Alert, the CPA was enacted in July 2021 to establish a framework for personal data privacy rights. The CPA, which is effective July 1, 2023 with certain opt-out provisions taking effect July 1, 2024, provides consumers with numerous rights, including the right to access their personal data, opt-out of certain uses of personal data, make corrections to personal data, request deletion of personal data, and obtain a copy of personal data in a portable format. Under the CPA, the AG has enforcement authority for the law, which does not have a private right of action. The AG also has authority to promulgate rules to carry out the requirements of the CPA and issue interpretive guidance and opinion letters, as well as the authority to develop technical specifications for at least one universal opt-out mechanism. The first set of draft rules was issued last September and published by the Secretary of State on October 10 (covered by InfoBytes here).

    The second set of draft rules seeks to address concerns raised through public comments as well as feedback received during three stakeholder sessions. The AG seeks specific input on questions related to (i) clarifications to definitions; (ii) the use of IP addresses to verify consumer opt-out requests; (iii) implementation of a universal opt-out mechanism; (iv) controller obligations related to meaningful privacy notices; and (v) bona fide loyalty programs. Among other things, the modifications would:

    • Clarify definitions. The modifications add, delete, and amend several definitions, including those related to “biometric identifiers,” “commercial product or service,” “controller,” “employee,” “employer,” “employment records,” “noncommercial purpose,” “personal data,” “process,” “processor,” “profiling,” and terms involving automated processing.
    • Amend purpose-based privacy notices. The modifications remove the requirement that privacy notices be purpose-based, and will instead require that the processing purpose and type of personal data processed be connected in a way that provides consumers a meaningful understanding of how their personal data will be used. The AG seeks feedback on ways the draft rules can “be made interoperable with California’s privacy notice requirements, while still considering the CPA’s purpose specification, secondary use requirements, and ensuring that a consumer has a meaningful understanding of the way their personal data will be used when they interact with a controller.” Feedback is also requested on whether controllers “who have updated their privacy policies to comply with California’s privacy notice requirements anticipate making a separate policy for Colorado, updating a California specific privacy notice to include Colorado or other state requirements, or revising the main privacy policy/notice to meet Colorado and other non-California state requirements[.]”
    • Update universal opt-out mechanism. The modifications grant controllers six months from the date a universal opt-out mechanism is recognized by the AG to begin complying with that mechanism. An initial public list of approved opt-out mechanisms will be published no later than January 1, 2024, and will be updated periodically. (A minimal sketch of honoring one widely discussed opt-out signal appears after this list.)
    • Clarify security measures and duty of care. The modifications provide additional details about the duty to safeguard personal data, and will require controllers to, among other things, consider “[a]pplicable industry standards and frameworks,” and the sensitivity, amount, and original source of the personal data when identifying reasonable and appropriate safeguards. The modifications also include provisions related to the processing of sensitive data inferences and specify deletion requirements.
    • Reduce data protection assessment requirements. The modifications reduce the information that must be included in a controller’s data protection assessment.
    • Clarify privacy notice changes. The modifications clarify when a controller must notify a consumer of “substantive or material” changes to its data processing that trigger updates to its privacy notice. The modifications emphasize that disclosure of a new processing purpose in a privacy policy alone does not constitute valid consent.
    • Address refreshing of consumer consent. The modifications provide that consumer consent must be refreshed when a consumer has not interacted with the controller in the last 12 months, and (i) the controller is processing sensitive personal information; or (ii) is processing personal data for secondary data use that involves profiling for a decision that could result “in the provision or denial of financial or lending services, housing, insurance, education enrollment or opportunity, criminal justice, employment opportunities, health-care services, or access to essential goods or services.” However, controllers will not be required to refresh consent in situations where consumers have the ability to update their own opt-out preferences at any time.
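
    As referenced in the universal opt-out bullet above, one widely discussed browser-based signal is the Global Privacy Control, which user agents send as the “Sec-GPC: 1” request header. The sketch below is a minimal, hypothetical illustration of a controller’s server honoring such a signal; whether the Colorado AG ultimately recognizes any particular mechanism will be determined by the published list described above, and the handler names here are assumptions.

    ```typescript
    // Sketch of a server treating the Global Privacy Control header as an
    // opt-out of targeted advertising. Node's http module lowercases header
    // names; endpoint behavior and header semantics here are illustrative.
    import { createServer, IncomingMessage, ServerResponse } from "http";

    function optOutSignalPresent(req: IncomingMessage): boolean {
      return req.headers["sec-gpc"] === "1";
    }

    const server = createServer((req: IncomingMessage, res: ServerResponse) => {
      if (optOutSignalPresent(req)) {
        // Suppress ad-tech tags and mark the session as opted out.
        res.setHeader("X-Targeted-Ads", "disabled");
      }
      res.end("ok");
    });

    server.listen(8080);
    ```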

    Comments on the second set of draft rules are due February 1. If the formal rulemaking hearing on the proposed rules (scheduled for February 1) extends beyond that date, comments must be received on or before the last day of the hearing.

    Privacy, Cyber Risk & Data Security State Issues State Attorney General Colorado Colorado Privacy Act Agency Rule-Making & Guidance

  • California privacy agency holds public meeting on CPRA

    Privacy, Cyber Risk & Data Security

    On December 16, the California Privacy Protection Agency (CPPA) Board held a public meeting to discuss the ongoing status of the California Privacy Rights Act (CPRA). As previously covered by InfoBytes, the CPRA (largely effective January 1, 2023, with enforcement delayed until July 1, 2023) was approved by ballot measure in November 2020 to amend and build on the California Consumer Privacy Act (CCPA). In July, the CPPA initiated formal rulemaking procedures to adopt proposed regulations implementing the CPRA, and in November the agency posted updated draft regulations (covered by InfoBytes here and here). The CPPA stated it anticipates conducting additional preliminary rulemaking in early 2023. After public input is received, the CPPA will discuss proposed regulatory frameworks for risk assessments, cybersecurity audits, and automated decisionmaking.

    During the board meeting, the CPPA introduced sample questions and subject areas for preliminary rulemaking that will be provided to the public at some point in 2023, and finalized and approved at a later meeting. The questions and topics relate to, among other things, (i) privacy and security risk assessment requirements, including whether the CPPA should follow the approach outlined in the European Data Protection Board’s Guidelines on Data Protection Impact Assessment, as well as other models or factors the agency should consider; (ii) benefits and drawbacks for businesses should the CPPA accept a business’s risk assessment submission that was completed in compliance with GDPR’s or the Colorado Privacy Act’s requirements for these assessments; (iii) how the CPPA can ensure cybersecurity audits, assessments, and evaluations are thorough and independent; and (iv) how to address profiling and logic in automated decisionmaking, the prevalence of algorithmic discrimination, and whether opt-out rights with respect to a business’s use of automated decisionmaking technology differ across industries and technologies. The CPPA said it is also considering different rules for businesses making under $25 million in annual gross revenues.

    Privacy, Cyber Risk & Data Security State Issues California CPPA CPRA CCPA Consumer Protection Agency Rule-Making & Guidance

  • FSOC annual report highlights digital asset, cybersecurity, and climate risks

    Federal Issues

    On December 16, the Financial Stability Oversight Council (FSOC or the Council) released its 2022 annual report. The report reviewed financial market developments, identified emerging risks, and offered recommendations to mitigate threats and enhance financial stability. The report noted that “amid heightened geopolitical and economic shocks and inflation, risks to the U.S. economy and financial stability have increased even as the financial system has exhibited resilience.” The report also noted that significant unaddressed vulnerabilities could potentially disrupt institutions’ ability to provide critical financial services, including payment clearings, liquidity provisions, and credit availability to support economic activity. FSOC identified 14 specific financial vulnerabilities and described mitigation measures. Highlights include:

    • Nonbank financial intermediation. FSOC expressed support for initiatives taken by the SEC and other agencies to address investment fund risks. The Council encouraged banking agencies to continue monitoring banks’ exposure to nonbank financial institutions, including reviewing how banks manage their exposure to leverage in the nonbank financial sector.
    • Digital assets. FSOC emphasized the importance of enforcing existing rules and regulations applicable to the crypto-asset ecosystem, but commented that there are gaps in the regulation of digital asset activities. The Council recommended that legislation be enacted to grant rulemaking authority to the federal banking agencies over crypto-assets that are not securities. The Council said that regulatory arbitrage needs to be addressed as crypto-asset entities offering services similar to those offered by traditional financial institutions do not have to comply with a consistent or comprehensive regulatory framework. FSOC further recommended that “Council members continue to build capacities related to data and the analysis, monitoring, supervision, and regulation of digital asset activities.”
    • Climate-related financial risks. FSOC recommended that state and federal agencies should continue to work to advance appropriately tailored supervisory expectations for regulated entities’ climate-related financial risk management practices. The Council encouraged federal banking agencies “to continue to promote consistent, comparable, and decision-useful disclosures that allow investors and financial institutions to consider climate-related financial risks in their investment and lending decisions.”
    • Treasury market resilience. FSOC recommended that member agencies review the structure and liquidity challenges of the Treasury market, and continue to consider policies “for improving data quality and availability, bolstering the resilience of market intermediation, evaluating expanded central clearing, and enhancing trading venue transparency and oversight.”
    • Cybersecurity. FSOC stated it supports partnerships between state and federal agencies and private firms to assess cyber vulnerabilities and improve cyber resilience. Acknowledging the significant strides made by member agencies this year to improve data collection for managing cyber risk, the Council encouraged agencies to continue gathering any additional information needed to monitor and assess cyber-related financial stability risks. 
    • LIBOR transition. FSOC recommended that firms “take advantage of any existing contractual terms or opportunities for renegotiation to transition their remaining legacy LIBOR contracts before the publication of USD LIBOR ends.” The Council emphasized that derivatives and capital markets should continue transitioning to the Secured Overnight Financing Rate (SOFR).

    CFPB Director Rohit Chopra issued a statement following the report’s release, flagging risks posed by the financial sector’s growing reliance on big tech cloud service providers. “Financial institutions are looking to move more data and core services to the cloud in coming years,” Chopra said. “The operational resilience of these large technology companies could soon have financial stability implications. A material disruption could one day freeze parts of the payments infrastructure or grind other critical services to a halt.” Chopra also commented that FSOC should determine next year whether to grant the agency regulatory authority over stablecoin activities under Dodd-Frank. He noted that “[t]hrough the stablecoin inquiry, it has become clear that nonbank peer-to-peer payments firms serving millions of American consumers could pose similar financial stability risks” as these “funds may not be protected by deposit insurance and the failure of such a firm could lead to millions of American consumers becoming unsecured creditors of the bankruptcy estate, similar to the experience with [a now recently collapsed crypto exchange].”

    Federal Issues Digital Assets CFPB FSOC Nonbank Department of Treasury Climate-Related Financial Risks Privacy, Cyber Risk & Data Security LIBOR SOFR Fintech

  • Gaming company to pay $520 million to resolve FTC allegations

    Federal Issues

    On December 19, the DOJ filed a complaint on behalf of the FTC against a video game developer for allegedly violating the Children’s Online Privacy Protection Act (COPPA) by failing to protect underage players’ privacy. The FTC also alleged in a separate administrative complaint that the company employed “dark patterns” to trick consumers into making unwanted in-game purchases, thus allowing players to accumulate unauthorized charges without parental involvement. (See also FTC press release here.)

    According to the complaint filed in the U.S. District Court for the Eastern District of North Carolina, the company allegedly collected personal information from players under the age of 13 without first notifying parents or obtaining parents’ verifiable consent. Parents who requested that their children’s personal information be deleted allegedly had to take unreasonable measures, the FTC claimed, and the company sometimes failed to honor these requests. The company is also accused of violating the FTC Act’s prohibition against unfair practices when its settings enabled, by default, real-time voice and text chat communications for children and teens. These default settings, as well as a matching system that enabled children and teens to be matched with strangers to play the game, exposed players to threats, harassment, and psychologically traumatizing issues, the FTC maintained. Although company employees raised concerns about the default settings and players reported problems, the FTC said that the company resisted turning the default setting off and, when it eventually did act, made it difficult for players to figure out how to turn voice chat off.

    Under the terms of a proposed court order filed by the DOJ, the company would be prohibited from enabling voice and text communications unless parents (of players under the age of 13) or teenage users (or their parents) provide affirmative consent through a privacy setting. The company would also be required to delete players’ information that was previously collected in violation of COPPA’s parental notice and consent requirements unless it obtains parental consent to retain such data or the player claims to be 13 or older through a neutral age gate. Additionally, the company must implement a comprehensive privacy program to address the identified violations, maintain default privacy settings, and obtain regular, independent audits. According to the DOJ’s announcement, the company has agreed to pay $275 million in civil penalties—the largest amount ever imposed for a COPPA violation.
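
    The “neutral age gate” referenced in the proposed order is, in general industry practice, a screen that asks for a birth date without pre-filling, suggesting, or defaulting to a value that would let a user bypass COPPA protections. The sketch below is a generic, hypothetical illustration of that pattern and of routing under-13 users into a parental-consent flow; it is not the company’s implementation.

    ```typescript
    // Illustrative neutral age gate: the birth date comes from an empty,
    // un-prefilled form field, and the computed age decides the signup path.
    function ageFromBirthDate(birthDate: Date, today: Date = new Date()): number {
      const rawAge = today.getFullYear() - birthDate.getFullYear();
      const hadBirthdayThisYear =
        today.getMonth() > birthDate.getMonth() ||
        (today.getMonth() === birthDate.getMonth() &&
          today.getDate() >= birthDate.getDate());
      return hadBirthdayThisYear ? rawAge : rawAge - 1;
    }

    function routeNewAccount(birthDate: Date): "parental-consent" | "standard" {
      // Under 13: obtain verifiable parental consent before collecting
      // personal information, per COPPA; 13 and over proceeds normally.
      return ageFromBirthDate(birthDate) < 13 ? "parental-consent" : "standard";
    }
    ```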

    With respect to the illegal dark patterns allegations, the FTC claimed that the company used a variety of dark patterns, such as “counterintuitive, inconsistent, and confusing button configuration[s],” designed to get players of all ages to make unintended in-game purchases. These tactics caused players to pay hundreds of millions of dollars in unauthorized charges, the FTC said, adding that the company also charged account holders for purchases without authorization. Players were able to purchase in-game content with the press of a button, without any parental or cardholder action or consent. Additionally, the company allegedly blocked access to purchased content for players who disputed unauthorized charges with their credit card companies, and threatened players with a lifetime ban if they disputed any future charges. Moreover, cancellation and refund features were purposefully obscured, the FTC asserted.

    To resolve the unlawful billing practices, the proposed administrative order would require the company to pay $245 million in refunds to affected players. The company would also be prohibited from charging players using dark patterns or without obtaining their affirmative consent. Additionally, the order would bar the company from blocking players from accessing their accounts should they dispute unauthorized charges.

    Federal Issues FTC DOJ Enforcement Privacy, Cyber Risk & Data Security COPPA FTC Act Unfair UDAP Consumer Finance Dark Patterns

  • FINRA alerts firms about rising ransomware risks

    Privacy, Cyber Risk & Data Security

    On December 14, FINRA issued Regulatory Notice 22-29, alerting member firms to the increasing number and sophistication of ransomware incidents. FINRA explained that the proliferation of ransomware attacks can be attributed in part to the increased use of technology and the continued adoption of cryptocurrencies, which bad actors use to conceal their identities when collecting ransom payments. Moreover, bad actors who purchase attack services on the dark web “have helped execute attacks on a much larger scale and make attacks available to less technologically savvy bad actors,” FINRA said. Under Rule 30 of the SEC’s Regulation S-P, firms are required to maintain written policies and procedures designed to reasonably safeguard customer records and information, FINRA stated, adding that FINRA Rule 4370 (related to business continuity plans and emergency contact information) also applies to ransomware attacks that include denials of service and other interruptions to firms’ operations. The notice provides questions for firms to consider when evaluating their cybersecurity programs and outlines common attack types and considerations for firms’ ransomware threat defenses, as well as additional ransomware controls and relevant resources.

    Privacy, Cyber Risk & Data Security FINRA Ransomware Digital Assets Cryptocurrency SEC
