
InfoBytes Blog

Financial Services Law Insights and Observations


Subscribe to our InfoBytes Blog weekly newsletter and other publications for news affecting the financial services industry.

  • E-commerce company fined $25 million for alleged COPPA violations

    Federal Issues

    On July 19, the DOJ and FTC announced that a global e-commerce tech company has agreed to pay a penalty for alleged privacy violations related to its smart voice assistant’s data collection and retention practices. The agencies sued the company at the end of May for violating the Children’s Online Privacy Protection Act Rule and the FTC Act, alleging it repeatedly assured users that they could delete collected voice recordings and geolocation information but actually held onto some of this information for years to improve its voice assistant’s algorithm, thus putting the data at risk of harm from unnecessary access. (Covered by InfoBytes here.)

    The stipulated order requires the company to pay a $25 million civil money penalty. The order also imposes injunctive relief requiring the company to (i) identify and delete any inactive smart voice assistant children’s accounts unless requested to be retained by a parent; (ii) notify parents whose children have accounts about updates made to its data retention and deletion practices and controls; (iii) cease making misrepresentations about its “retention, access to or deletion of geolocation information or voice information, including children’s voice information” and delete this information upon request of the user or parent; and (iv) disclose its geolocation and voice information retention and deletion practices to consumers. The company must also implement a comprehensive privacy program specific to its use of users’ geolocation information.

    Federal Issues Privacy, Cyber Risk & Data Security DOJ FTC Enforcement COPPA FTC Act Consumer Protection

  • FTC proposal would allow facial recognition for consent under COPPA

    Agency Rule-Making & Guidance

    On July 19, the FTC announced it is seeking public feedback on whether it should approve an application that proposes to create a new method for obtaining parental consent under the Children’s Online Privacy Protection Act (COPPA). The new method would involve analyzing a user’s facial geometry to confirm the individual’s age. Under COPPA, online sites and services directed to children under 13 are required to obtain parental consent before collecting or using a child’s personal information. COPPA provides a number of acceptable methods for obtaining parental consent but also allows interested parties to submit proposals for new verifiable parental consent methods to the FTC for approval.

    The application was submitted by a company that runs a COPPA safe harbor program, along with a digital identity company and a technology firm that helps companies comply with parental verification requirements. Specifically, the FTC’s request for public comment solicits feedback on several questions relating to the application, including: (i) whether the proposed age verification method is covered by existing methods; (ii) whether the proposed method meets COPPA’s requirements for parental consent (i.e., can the proposed method ensure that the person providing consent is the child’s parent); (iii) whether the proposed method introduces a privacy risk to consumers’ personal information, including their biometric information; and (iv) whether the proposed method “pose[s] a risk of disproportionate error rates or other outcomes for particular demographic groups.” Comments are due 30 days after publication in the Federal Register.

    Agency Rule-Making & Guidance Federal Issues Privacy, Cyber Risk & Data Security Consumer Protection FTC COPPA

  • Feds, states launch “Operation Stop Scam Calls”

    Federal Issues

    On July 18, the FTC, along with over 100 federal and state law enforcement partners nationwide, including the DOJ, FCC, and attorneys general from all 50 states and the District of Columbia, announced a new initiative to combat illegal telemarketing calls, including robocalls. The joint initiative, “Operation Stop Scam Calls,” targets telemarketers and the companies that hire them, lead generators that provide consumers’ telephone numbers to robocallers and others who falsely represent that consumers consented to receive the calls. The initiative also targets Voice over Internet Protocol (VoIP) service providers that facilitate illegal robocalls, many of which originate overseas.

    In connection with Operation Stop Scam Calls, the FTC has initiated five new cases against companies and individuals allegedly responsible for distributing or assisting in the distribution of illegal telemarketing calls to consumers across the country. According to the announcement, the actions reiterate the FTC’s position “that third-party lead generation for robocalls is illegal under the Telemarketing Sales Rule (TSR) and that the FTC and its partners are committed to stopping illegal calls by targeting anyone in the telemarketing ecosystem that assists and facilitates these calls, including VoIP service providers.” The announcement also states that more than 180 enforcement actions and other initiatives have been taken by 48 federal and 54 state agencies as part of Operation Stop Scam Calls.

    Among the new actions announced as part of Operation Stop Scam Calls is a complaint filed against a “consent farm” lead generator, which allegedly uses “dark patterns” to collect consumers’ broad agreement to provide their personal information and receive robocalls and other marketing solicitations through a single click of a button or checkbox via its websites. Under the terms of the proposed order, the defendant would be required to pay a $2.5 million civil penalty and would be banned from engaging in, assisting, or facilitating robocalls. The defendant would also be required to implement measures to limit its lead generation practices, establish systems for monitoring its own advertising and that of its affiliates, comply with comprehensive disclosure requirements concerning the collection of consumers’ consent to the sale of their information, and delete all previously collected consumer information.

    Other actions were taken against a California-based telemarketing lead generator, a telemarketing company that provides soundboard calling services to clients who use robocalls to sell a range of products and services, a New Jersey-based telemarketing outfit that placed tens of millions of calls to consumers whose numbers are listed on the National Do Not Call Registry, and Florida-based defendants accused of assisting and facilitating the transmission of roughly 37.8 million illegal robocalls by providing VoIP services to over 11 foreign telemarketers.

    Federal Issues State Issues Courts FTC Enforcement Robocalls Consumer Protection State Attorney General TSR Telemarketing Lead Generation DOJ FCC

  • FTC fines company $7.8 million over health data and third-party advertisers

    Federal Issues

    On July 14, the FTC finalized an order against an online counseling service, requiring it to pay $7.8 million and prohibiting the sharing of consumers’ health data for advertising purposes. The FTC alleged that the respondent shared consumers’ sensitive health data with third parties despite promising to keep such information private (covered by InfoBytes here). The FTC said it will use the settlement funds to provide partial refunds to affected consumers. The order not only bans the respondent from disclosing health data for advertising and marketing purposes but also prohibits the sharing of consumers’ personal information for re-targeting. The order also stipulates that the respondent must now obtain consumers’ affirmative express consent before disclosing personal information, implement a comprehensive privacy program with certain data protection measures, instruct third parties to delete shared data, and adhere to a data retention schedule.

    Federal Issues Privacy, Cyber Risk & Data Security FTC Enforcement Consumer Protection Telehealth FTC Act Deceptive Advertisement Third-Party

  • Illinois Supreme Court declines to reconsider BIPA accrual ruling

    Privacy, Cyber Risk & Data Security

    On July 18, the Illinois Supreme Court declined to reconsider its February ruling, which held that under the state’s Biometric Information Privacy Act (BIPA or the Act), claims accrue “with every scan or transmission of biometric identifiers or biometric information without prior informed consent.” Three justices, however, dissented from the denial of rehearing, writing that the ruling leaves “a staggering degree of uncertainty” by offering courts and defendants little guidance on how to determine damages. The putative class action stemmed from allegations that the defendant fast food chain violated BIPA sections 15(b) and (d) by unlawfully collecting plaintiff’s biometric data and disclosing the data to a third-party vendor without first obtaining her consent. While the defendant challenged the timeliness of the action, the plaintiff asserted that “a new claim accrued each time she scanned her fingerprints” and her data was sent to a third-party authenticator, thus “rendering her action timely with respect to the unlawful scans and transmissions that occurred within the applicable limitations period.”

    In February, a split Illinois Supreme Court held that claims accrue under BIPA each time biometric identifiers or biometric information (such as fingerprints) are scanned or transmitted, rather than simply the first time. (Covered by InfoBytes here.) The dissenting judges wrote that they would have granted rehearing because the majority’s determination that BIPA claims accrue with every transmission “subvert[s] the intent of the Illinois General Assembly, threatens the survival of businesses in Illinois, and consequently raises significant constitutional due process concerns.” The dissenting judges further maintained that the majority’s February decision is confusing and lacks guidance for courts when determining damages awards. While the majority emphasized that BIPA does not contain language “suggesting legislative intent to authorize a damages award that would result in the financial destruction of a business,” it also said that it continues “to believe that policy-based concerns about potentially excessive damage awards under [BIPA] are best addressed by the legislature,” and that it “respectfully suggest[s] that the legislature review these policy concerns and make clear its intent regarding the assessment of damages under [BIPA].”


    Privacy, Cyber Risk & Data Security Courts State Issues Illinois BIPA Enforcement Consumer Protection Class Action

  • Oregon is 11th state to enact comprehensive privacy legislation

    Privacy, Cyber Risk & Data Security

    On July 18, the Oregon governor signed SB 619 (the Act) to establish a framework for controlling and processing consumer personal data in the state. Oregon follows California, Colorado, Connecticut, Virginia, Utah, Iowa, Indiana, Tennessee, Montana, and Texas in enacting comprehensive consumer privacy measures. Last month, Florida also enacted privacy legislation, but the requirements focus on specific digital controllers with global gross annual revenues of more than $1 billion.

    Highlights of the Act include:

    • Applicability. The Act applies to persons conducting business or producing products or services intentionally directed at Oregon residents that either control or process personal data of more than 100,000 consumers per calendar year (“other than personal data controlled or processed solely for the purpose of completing a payment transaction”) or earn 25 percent or more of their gross revenue from the sale of personal data and process or control the personal data of 25,000 consumers or more. Additionally, the Act provides several exemptions, including for financial institutions and their affiliates, data governed by the Gramm-Leach-Bliley Act and certain other federal laws, nonprofit organizations, and protected health information processed by a covered entity in compliance with the Health Insurance Portability and Accountability Act, among others. The Act does not apply to personal information collected in the context of employment or business-to-business relationships.
    • Consumer rights. Under the Act, consumers will be able to access their personal data, make corrections, request deletion of their data, and obtain a copy of their data in a portable format. Consumers will also be able to opt out of the processing of personal information for targeted advertising, the sale of personal information, or profiling “in furtherance of decisions that produce legal effects or effects of similar significance.” Data controllers also will be required to obtain a consumer’s consent to process sensitive personal information or, in the case of a known child, obtain consent from the child’s parent or lawful guardian. Additionally, the Act requires opt-in consent for using the personal data of a youth 13 to 15 years old for targeted advertising or profiling. The Act makes clear that consent means “an affirmative act by means of which a consumer clearly and conspicuously communicates the consumer’s freely given, specific, informed and unambiguous assent to another person’s act or practice.” This does not include the use of an interface “that has the purpose or substantial effect of obtaining consent by obscuring, subverting or impairing the consumer’s autonomy, decision-making or choice.” Controllers that receive a consent revocation from a consumer must process the revocation within 15 days.
    • Controller responsibilities. Among the Act’s requirements, data controllers will be responsible for (i) responding to consumer requests within 45 days after receiving a request (a 45-day extension may be granted when reasonably necessary upon notice to the consumer); (ii) providing clear and meaningful privacy notices; (iii) disclosing to consumers when their personal data is sold to third parties or processed for targeted advertising, and informing consumers how they may opt out; (iv) limiting the collection of data to what is adequate, relevant, and reasonably necessary for a specified purpose and securing personal data from unauthorized access; (v) conducting and retaining data protection assessments where there is a heightened risk of harm and ensuring deidentified data cannot be associated with a consumer; and (vi) avoiding unlawful discrimination.
    • Data processing agreements. The Act stipulates that processors must follow a controller’s instructions and help meet the controller’s obligations concerning the processing of personal data. The Act also sets forth obligations relating to contracts between a controller and a processor. Processors that engage a subcontractor must ensure the subcontractor meets the processor’s obligations with respect to personal data under the processor’s contract with the controller. 
    • Private right of action and state attorney general enforcement. The Act does not provide a private right of action to consumers. Instead, the Oregon attorney general may investigate violations and seek civil penalties of no more than $7,500 per violation. Before initiating such action, the attorney general may grant the controller 30 days to cure the violation. 

    The Act takes effect July 1, 2024.

    Privacy, Cyber Risk & Data Security State Issues State Legislation Oregon Consumer Protection

  • Washington releases FAQs for My Health My Data Act

    Privacy, Cyber Risk & Data Security

    On June 20, the Washington attorney general published a series of Frequently Asked Questions (FAQs) related to the My Health My Data Act—a comprehensive health privacy law that provides broad restrictions on the use of consumer health data (covered by InfoBytes here). The FAQs include information on the law’s effective dates and applicability. According to the AG, “all persons, as defined in the Act, must comply with section 10 beginning July 23, 2023. Regulated entities that are not small businesses must comply with sections 4 through 9 beginning March 31, 2024. Small businesses, as defined in the Act, must comply with sections 4 through 9 beginning June 30, 2024. For sections 4 through 9, the effective dates apply to the entirety of the section and are not limited to the subsections in which the effective dates appear.” Additionally, the FAQs clarify that a business that is covered by the Act must provide a link to its consumer health data privacy policy on its homepage.

    The FAQs also address a potential conflict between Sections 6 and 9 of the Act regarding the right to delete and consumers’ authorizations to sell data, respectively. Section 9 mandates that any person, not just regulated entities, must obtain consumer authorization before selling or offering to sell their data. Both the seller and purchaser are required to retain a copy of the authorization, which may contain consumer health data, for six years. However, Section 6 stipulates that consumer health data should be deleted from a regulated entity’s network upon the consumer’s request. The FAQs advise that in cases where a consumer requests deletion under Section 6, any authorizations stored under Section 9 must be redacted to eliminate any information related to the data that was sold.

    Privacy, Cyber Risk & Data Security State Issues Washington Consumer Protection Medical Data State Attorney General

  • California probes employers’ CCPA compliance

    Privacy, Cyber Risk & Data Security

    On July 14, the California attorney general announced it recently sent inquiries to several large employers as part of an investigation into companies’ compliance with their legal obligations under the California Consumer Privacy Act (CCPA). The investigation centers on how companies handle the personal information of employees and job applicants. As previously covered by InfoBytes, temporary exemptions related to human resource and business-to-business data provided by the CCPA and the California Privacy Rights Act expired on January 1 of this year. Amendments were introduced last legislative session that would have extended the exemption for “personal information that is collected and used by a business solely within the context of having an emergency contact on file, administering specified benefits, or a person’s role . . . [in] that business.” The amendments also proposed extending certain exemptions related to “personal information reflecting a communication or a transaction between a business and a company, partnership, sole proprietorship, nonprofit, or government agency that occurs solely within the context of the business conducting due diligence or providing or receiving a product or service.” However, the amendments were not adopted, and the exemptions expired.

    The AG said they are sending the inquiry letters “to learn how employers are complying with their legal obligations.” Covered businesses subject to the CCPA are required to comply with the statute’s privacy protections as they relate to employee data, including providing notice of privacy practices and honoring consumer requests to exercise their rights to access, delete, and opt out of the sale and sharing of their personal information.

    Privacy, Cyber Risk & Data Security State Issues California State Attorney General CCPA Consumer Protection

  • Agencies charge crypto platform and former executives

    Federal Issues

    On July 13, the FTC announced a proposed settlement to resolve allegations that a crypto platform engaged in unfair and deceptive acts or practices in violation of the FTC Act. The FTC also alleges that the defendants violated the Gramm-Leach-Bliley Act by using false or misleading statements to acquire customer information of a financial institution relating to another person. The New Jersey-based crypto company offers various cryptocurrency products and services to customers, such as interest-bearing accounts, personal loans backed by cryptocurrency deposits, and a cryptocurrency exchange. On the heels of its bankruptcy filing in July 2022, the FTC lodged a complaint in federal court alleging that three former executives falsely promised that deposits would be “safer” than bank deposits and always available for withdrawal, and that the platform posed “no risk” or “minimal risk.”

    The proposed stipulated order imposes a $4.72 million judgment against the corporate defendants, which is suspended based on their financial condition. The order also bans the corporate defendants from, among other things, “advertising, marketing, promoting, offering, or distributing, or assisting in the advertising, marketing, promoting, offering, or distributing of any product or service that can be used to deposit, exchange, invest, or withdraw assets, whether directly or through an intermediary.” 

    Other agencies also took action against the company and its former CEO on the same day, including the SEC, which alleges the company sold unregistered crypto asset securities in one of its program offerings. The SEC’s complaint further alleges the company made false and misleading statements and engaged in market manipulation. Additionally, the DOJ unsealed an indictment charging the former CEO and the company’s former chief revenue officer with conspiracy, securities fraud, market manipulation, and wire fraud for illicitly manipulating the price of the company’s token. Separately, the CFTC filed a civil complaint charging the company and former CEO with fraud and material misrepresentations in connection with the operation of the company’s digital asset-based finance platform. The CFTC alleges the company operated as an unregistered commodity pool operator (CPO), and its former CEO operated as an unregistered associated person of a CPO. The complaint also accuses the former CEO of violating the Commodity Exchange Act and CFTC regulations, among other things. According to the press release, the company agreed to resolve the complaint, while the former CEO is continuing litigation.

    Federal Issues Digital Assets Securities Fintech Cryptocurrency FTC FTC Act Gramm-Leach-Bliley Enforcement Consumer Protection Deceptive SEC CFTC DOJ

  • 11th Circuit orders reexamination of breach class boundaries

    Privacy, Cyber Risk & Data Security

    On July 11, a split U.S. Court of Appeals for the Eleventh Circuit partially vacated the greenlighting of two data breach class actions, holding that a district court must re-analyze the boundaries of the classes. Both the nationwide and California classes are individuals who sued a restaurant chain after their card data and personally identifiable information were compromised in a cyberattack. Plaintiffs claimed that information for roughly 4.5 million cards could be accessed on an online marketplace for stolen payment information. Two of the three named plaintiffs also said they experienced unauthorized charges on their accounts. Plaintiffs moved to certify two classes seeking both injunctive and monetary relief—a nationwide (or alternatively a statewide) class for negligence and a California class for claims based on the state’s unfair business practices laws. The district court certified a nationwide class and a separate California-only class. The restaurant chain’s parent company appealed, arguing that the certification violates court precedent on Article III standing for class actions, that the classes do not meet the commonality requirements for certification, and that the district court erred by finding that a common damages methodology existed for the class.

    On appeal, the majority found that at the class certification stage, plaintiffs only had to show that a reliable damages methodology existed. The majority also determined that the district court correctly found that plaintiffs’ expert presented a sufficient methodology for calculating damages and that “it would be a ‘matter for the jury’ to decide actual damages at trial.” However, the majority remanded the case with instructions for the district court to clarify what it meant when it certified classes of individuals who had their “data accessed by cybercriminals.” According to the opinion, the district court meant for this term to encompass individuals who experienced fraudulent charges or whose credit card information was posted on the dark web. The majority expressed concerns that the phrase “accessed by cybercriminals” is broader than the two delineated categories provided by the district court and could include individuals who had their data taken but were otherwise uninjured. The majority also vacated the California class certification after determining that two of the three named plaintiffs lacked standing because they dined at the restaurant outside of the “at-risk” timeframe. The district court’s damages calculation methodology, however, was left undisturbed by the appellate court.  

    Partially dissenting, one of the judges wrote that while she agreed that one of the named plaintiffs had standing to sue, she disagreed with the majority’s concrete injury analysis. The judge also argued that the district court erred in its damage calculations by “impermissibly permit[ting] plaintiffs to receive an award based on damages that they did not suffer.”

    Privacy, Cyber Risk & Data Security Courts State Issues California Appellate Eleventh Circuit Consumer Protection Class Action Data Breach
