
InfoBytes Blog

Financial Services Law Insights and Observations


  • New Hampshire enshrines a new consumer privacy law

    Privacy, Cyber Risk & Data Security

    On March 6, New Hampshire Governor Chris Sununu signed into law a sweeping consumer privacy bill. Under the act, consumers will have the right to confirm whether a controller (a person who controls the processing of personal data) is processing their personal data, as well as rights to access that data, correct inaccuracies, obtain a copy, delete it, and opt out of its processing for targeted advertising purposes. The act also imposes limits on controllers, including that a controller shall (i) limit the collection of data to what is adequate, relevant, and reasonably necessary for the intended purpose; (ii) establish and maintain administrative security practices to protect the confidentiality of consumer personal data; (iii) not process sensitive data without obtaining the consumer’s consent or, if the data concerns a known child, process the data in accordance with COPPA; (iv) provide an easy means for consumers to revoke consent; and (v) not process personal data for targeted advertising purposes without consumer consent. The bill further outlines a processor’s responsibilities and requires controllers to conduct a data protection assessment for each processing activity that may present a risk of harm to a consumer. The act will go into effect on January 1, 2025.

    Privacy, Cyber Risk & Data Security State Issues New Hampshire State Legislation Opt-Out

  • CPPA releases latest draft of automated decision-making technology regulation

    State Issues

    The California Privacy Protection Agency (CPPA) released an updated draft of its proposed enforcement regulations for automated decisionmaking technology in connection with its March 8 board meeting. The draft regulations include new definitions, including “automated decisionmaking technology,” which means “any technology that processes personal information and uses computation to execute a decision, replace human decisionmaking, or substantially facilitate human decisionmaking.” This definition expands the regulations’ scope beyond the previous September draft (covered by InfoBytes here).

    Among other things, the draft regulations would require businesses that use automated decisionmaking technology to provide consumers with a “Pre-use Notice” informing them of (i) the business’s use of the technology; (ii) their right to opt out of the business’s use of the technology (unless an exemption applies) and how to submit such a request; (iii) their right to access information; and (iv) how the automated decisionmaking technology works, including its intended content and recommendations and how the business plans to use the output. The draft regulations detail further requirements for the opt-out process.

    The draft regulations also include a new article, entitled “risk assessments,” which sets out when a business must conduct assessments and imposes requirements on businesses that process personal information to train automated decisionmaking technology or artificial intelligence. Under the proposed regulations, every business whose processing of consumers’ personal information may present a significant risk to consumers’ privacy must conduct a risk assessment before initiating that processing. If a business previously conducted a risk assessment for a processing activity in compliance with the article and submitted an abridged risk assessment to the CPPA, and nothing has changed, the business is not required to submit an updated risk assessment; it must, however, submit a certification of compliance to the CPPA.

    The CPPA has not yet started the formal rulemaking process for these regulations; the drafts are provided to facilitate board discussion and public participation and are subject to change.

    State Issues Privacy Agency Rule-Making & Guidance California CPPA Artificial Intelligence

  • House Committee report finds broad financial surveillance by federal government using financial institutions’ data following January 6 events

    Privacy, Cyber Risk & Data Security

    On March 5, the House Committee on the Judiciary and its Select Subcommittee on the Weaponization of the Federal Government released an interim staff report on how federal law enforcement agencies, in the wake of the events of January 6, 2021, at the U.S. Capitol, engaged in financial surveillance by encouraging financial institutions to provide data on consumers’ private transactions without a nexus to criminal conduct. The report indicated that the consumers particularly targeted were those who tend to hold “conservative viewpoints.” The report cited several whistleblower testimonies and provided email transcripts of the government agents’ requests. One institution allegedly acted “voluntarily and without legal process” and provided the FBI with a dataset of names of those who used that institution’s credit or debit card in the Washington, D.C. region between January 5 and January 7, 2021; the dataset also included anyone who had ever used that institution’s debit or credit card to purchase a firearm. The report suggested that citizens who did nothing other than go “shopping or exerciz[e] their Second Amendment rights” were placed under a type of financial surveillance between their financial institution and the government, and it made specific mention of the risk to right-leaning individuals.

    The report provided context with the Right to Financial Privacy Act of 1978, Section 314(a) of the USA Patriot Act, and the Bank Secrecy Act in mind. While these federal acts were created to protect citizens, the report alleged they “have failed to adequately protect Americans’ financial information.” The report was particularly critical of the federal government’s use of “informal meetings and backchannel discussions” with financial institutions to devise the best methods for obtaining Americans’ private financial information, including using merchant category codes and politicized “search terms,” and of its dissemination of “political materials” to such institutions that were allegedly “hostile” to conservative viewpoints and “treated lawful transactions as suspicious.”

    Privacy, Cyber Risk & Data Security House Judiciary Committee Banking Bank Secrecy Act

  • FCC partners with two U.K. regulators to combat privacy threats and protect consumer data

    Privacy, Cyber Risk & Data Security

    Recently, the FCC announced (here and here) that it has partnered with two U.K. communications regulatory agencies to address privacy and data protection issues in telecommunications. The FCC issued two separate statements because the two U.K. regulators perform different duties: the first announcement is with the U.K. Information Commissioner’s Office (ICO), which regulates data protection and information rights; the second is with the U.K.’s Office of Communications (OFCOM), which regulates telecommunications. Both announcements highlighted a strengthening of resources and networks to protect consumers on an international scale, given the large amounts of data shared via international telecom carriers.

    The FCC’s announcement with the ICO explained that the partnership will focus on combating robocalls and robotexts, as well as on better protecting consumer privacy and data. In its announcement with OFCOM, the FCC described a new collaboration to combat illegal robocalls and robotexts, given the two countries’ shared interest in investigating network abuses. The FCC elaborated on its desire to bolster requirements for gateway providers, which serve as the “on-ramp” for international internet traffic into U.S. networks.

    Privacy, Cyber Risk & Data Security FCC UK Of Interest to Non-US Persons Privacy Data Protection

  • White House orders DOJ and CFPB to better protect citizens’ sensitive personal data

    Privacy, Cyber Risk & Data Security

    On March 1, the White House released Executive Order 14117 (E.O.), titled “Preventing Access to Americans’ Bulk Sensitive Personal Data and United States Government-Related Data by Countries of Concern,” to establish safeguards for Americans’ private information. The E.O. was preceded by a White House Fact Sheet describing provisions to protect Americans’ genomic and biometric information, personal health data, geolocation, and financial data, among other categories. The E.O. explained how this data can be exploited by nefarious actors, such as foreign intelligence services or companies, to enable privacy violations. Under the E.O., President Biden ordered several agencies to act but primarily called on the DOJ. The president directed the DOJ to issue regulations protecting Americans’ data from exploitation by certain countries. The White House also directed the DOJ to issue regulations protecting government-related data, specifically citing protections for geolocation information and information about military members. Lastly, the DOJ was directed to work with DHS to prevent certain countries from accessing citizens’ data through commercial means, and the CFPB was encouraged to “[take] steps, consistent with CFPB’s existing legal authorities, to protect Americans from data brokers that are illegally assembling and selling extremely sensitive data, including that of U.S. military personnel.”

    A few days earlier, the DOJ released its own fact sheet detailing its proposals to implement the E.O., focusing on national security risks and data security. The fact sheet highlighted that current U.S. law leaves open lawful access to vast amounts of Americans’ sensitive personal data that may be purchased and accessed through commercial relationships. In response to the E.O., the DOJ plans to release future regulations “addressing transactions that involve [Americans’] bulk sensitive data” and that pose a risk of access by countries of concern. The countries of concern include China (including Hong Kong and Macau), Russia, Iran, North Korea, Cuba, and Venezuela. The DOJ will also release an Advance Notice of Proposed Rulemaking (ANPRM) to provide details of the proposal(s) and to solicit comments.

    Privacy, Cyber Risk & Data Security Federal Issues Department of Justice CFPB Executive Order Department of Homeland Security White House Big Data China Russia Iran North Korea Cuba Venezuela

  • FTC alleges a common enterprise’s software misrepresented the handling of consumers’ sensitive browsing data

    Federal Issues

    On February 22, the FTC released a complaint and decision against multiple software companies operating as a common enterprise, alleging three violations of Section 5 of the FTC Act: (1) unfairly collecting consumers’ browsing information; (2) deceptively failing to disclose tracking of consumers; and (3) making false representations about data aggregation and anonymization. The FTC alleged that, from 2014 to 2020, the companies distributed software with several privacy claims, including that the software would block cookies and prevent browser tracking, while collecting consumers’ browsing information without consent and deceiving consumers about the true nature of their practices.

    The FTC alleged the companies collected browser information through browser extensions and antivirus software. While the companies claimed that these extensions provided security and privacy services, they used the extensions to collect browser information from users, including URLs of visited webpages, URLs of background resources (e.g., cookies or images pulled from other domains), consumers’ search queries, and cookie values. While the companies made claims about the privacy and security of their products, they failed to disclose to consumers that their browsing information was sold to third parties and misrepresented how the data was shared. This browsing information can comprise sensitive data, possibly revealing a consumer’s religious beliefs, health information, political ideology, location, finances, and “interests in prurient content.” The FTC noted that when the companies asked software users in 2019 to opt in to the collection of browser information, less than 50 percent of consumers agreed.

    Under the FTC’s Decision, the companies must pay $16.5 million in monetary relief. Additionally, the FTC enjoined the companies from licensing or selling any browsing data from branded products to third parties for advertising purposes, and the companies are required to (a) obtain consent from consumers before selling consumers’ browsing data from non-branded products for advertising; (b) delete consumer web browsing information and certain products or algorithms derived from that data; (c) notify consumers whose information was previously sold without their consent; and (d) implement a privacy program.

    Federal Issues Data Consumer Data Privacy, Cyber Risk & Data Security

  • California Attorney General settles with food delivery company for allegedly violating two state privacy acts

    Privacy, Cyber Risk & Data Security

    On February 21, the California Attorney General’s Office announced a complaint against a food delivery company for allegedly violating the California Consumer Privacy Act of 2018 (CCPA) and the California Online Privacy Protection Act of 2003 (CalOPPA) by failing to provide consumers notice of the sale of their personal information or an opportunity to opt out of that sale.

    The CCPA requires businesses that sell personal information to make specific disclosures and give consumers the right to opt out of the sale. Under the CCPA, a company must disclose a privacy policy and post an “easy-to-find ‘Do Not Sell My Personal Information’ link.” The California AG alleged that the company did neither. The AG also alleged that the company violated CalOPPA by not making required privacy policy disclosures: the company’s existing disclosures indicated that it would use customer data only to present advertisements, not give that information to other businesses to use.

    The proposed stipulated judgment, if approved by a court, will require the company to pay a $375,000 civil money penalty and to (i) comply with CCPA and CalOPPA requirements; (ii) review contracts with vendors to evaluate how the company shares personal information; and (iii) provide annual reports to the AG on potential sales or sharing of personal information.

    Privacy, Cyber Risk & Data Security California State Attorney General CCPA CalOPPA Enforcement Data

  • California appeals court vacates ruling enjoining enforcement of CPRA regulations

    State Issues

    On February 9, California’s Third District Court of Appeal vacated a lower court’s decision to enjoin the California Privacy Protection Agency (CPPA) from enforcing regulations implementing the California Privacy Rights Act (CPRA). The decision reverses the trial court’s ruling delaying enforcement of the regulations until March 2024, which would have given businesses a one-year implementation period from the date final regulations were promulgated (covered by InfoBytes here).

    The CPRA mandated that the CPPA finalize regulations on specific elements of the act by July 1, 2022, and provided that “the Agency’s enforcement authority would take effect on July 1, 2023,” a one-year gap between promulgation and enforcement. The CPPA did not issue final regulations until March 2023, but sought to enforce the rules starting on the July 1, 2023, statutory date. In response, in March 2023, the Chamber of Commerce filed a lawsuit in state court seeking a one-year delay of enforcement of the new regulations. The trial court held that a delay was warranted because “voters intended there to be a gap between the passing of final regulations and enforcement of those regulations.” On appeal, the court emphasized that no explicit and unambiguous language in the law prohibits the agency from enforcing the CPRA until at least one year after final regulations are approved, and found that while the gap between the CPRA’s mandatory dates “amounts to a one-year delay,” such a delay was not mandated by the statutory language. The court further found no indication in the ballot materials available to voters that the voters intended such a one-year delay. The court explained that the one-year gap could have been intended to give businesses time to comply, or as a period for the agency to prepare for enforcing the new rules, or for other reasons.

    Accordingly, the appellate court held that the Chamber of Commerce “was simply not entitled to the relief granted by the trial court.” As a result of the decision, businesses must now begin implementing the privacy regulations established by the agency.

    State Issues Privacy Courts California Appellate CPPA CPRA

  • District Court dismisses FDCPA class action for lack of standing, finding no concrete injury

    Courts

    On January 31, the U.S. District Court for the Eastern District of New York dismissed an FDCPA class action lawsuit for lack of standing. According to the order, the plaintiff alleged numerous violations of the FDCPA related to two debt collection letters. In September 2023, a debt collector (the defendant) reportedly sent the plaintiff two letters that allegedly did not contain the information the FDCPA requires in communications with consumers, including validation and itemization details. One of the letters purportedly demanded payment by September 29, a date falling within the 30-day validation period. Additionally, the plaintiff asserted that one of the letters was addressed to his girlfriend, who bore no responsibility for the debt. The plaintiff claimed two concrete injuries: (i) the letters allegedly strained his relationship with his girlfriend, causing emotional distress; and (ii) due to the omission of critical information in the letters, the plaintiff felt confused and uncertain about how to respond effectively.

    In considering the plaintiff’s claims, the court discussed the elements required to state a claim for publicity given to private life and examined a specific case in which such a claim was rejected. It highlighted that for such a claim to succeed, the matter publicized must be highly offensive to a reasonable person and not of legitimate public concern, and that mere communication of private information to a single person typically does not constitute publicity unless it has the potential to become public knowledge. Although Congress explicitly prohibits debt collectors from sharing consumer financial information with third parties, the court noted that this prohibition “does not automatically transform every arguable invasion of privacy into an actionable, concrete injury.” The plaintiff’s first injury, as pleaded, was therefore deemed insufficiently concrete for standing purposes. Regarding the second alleged injury, the court reasoned that mere confusion or frustration does not qualify as a concrete injury for standing purposes, and it distinguished cases in which plaintiffs had alleged confusion but had also demonstrated further injuries.

    Courts FDCPA Class Action Consumer Finance Litigation Standing Debt Collection

  • Connecticut Attorney General reports on Connecticut Data Privacy Act

    State Issues

    On February 1, Connecticut’s Attorney General (AG) released a report on the Connecticut Data Privacy Act (CTDPA), including information on the law and how the state enforces it. Enacted in May 2022, the CTDPA is a comprehensive consumer data privacy law that took effect on July 1, 2023. The CTDPA gives Connecticut consumers a set of rights regarding their personal information and sets privacy standards for businesses handling such data. Connecticut residents can: (i) see what data companies have on them; (ii) ask for corrections of inaccurate information; (iii) request the deletion of their data; and (iv) choose not to have their personal information used for selling products, targeted advertisements, or profiling. The report noted that within the first six months the CTDPA was in effect, the AG issued dozens of violation notices and information requests. It added that companies generally responded positively to the notices and quickly updated their privacy policies and consumer rights mechanisms. According to the report, while some companies initially fell short of the CTDPA’s requirements, they later made changes to comply, and a few went beyond the areas identified in the notices by strengthening their disclosures.

    The report also mentioned that beginning on January 1, 2025, businesses will be required to acknowledge universal opt-out signals, reflecting consumers’ choice to opt out of targeted advertising and the sale of personal data; this mandatory provision, emphasized during Connecticut’s legislative process as a way to alleviate the burden on consumers, has been enacted into law. Finally, the report discussed possible expansions and clarifications to the CTDPA for the legislature to consider.

    State Issues Connecticut State Attorney General Privacy, Cyber Risk & Data Security
