InfoBytes Blog

Financial Services Law Insights and Observations

  • Gaming company to pay $520 million to resolve FTC allegations

    Federal Issues

    On December 19, the DOJ filed a complaint on behalf of the FTC against a video game developer for allegedly violating the Children’s Online Privacy Protection Act (COPPA) by failing to protect underage players’ privacy. The FTC also alleged in a separate administrative complaint that the company employed “dark patterns” to trick consumers into making unwanted in-game purchases, allowing children to rack up unauthorized charges without any parental involvement. (See also the FTC press release here.)

    According to the complaint filed in the U.S. District Court for the Eastern District of North Carolina, the company allegedly collected personal information from players under the age of 13 without first notifying parents or obtaining parents’ verifiable consent. Parents who requested that their children’s personal information be deleted allegedly had to take unreasonable measures, the FTC claimed, and the company sometimes failed to honor these requests. The company is also accused of violating the FTC Act’s prohibition against unfair practices because its settings enabled, by default, real-time voice and text chat communications for children and teens. These default settings, together with a matching system that paired children and teens with strangers to play the game, exposed players to threats, harassment, and psychologically traumatizing issues, the FTC maintained. Although company employees raised concerns about the default settings and players reported problems, the FTC said that the company resisted turning the default settings off, and when it eventually did act, made it difficult for players to figure out how to turn voice chat off.

    Under the terms of a proposed court order filed by the DOJ, the company would be prohibited from enabling voice and text communications unless parents (of players under the age of 13) or teenage users (or their parents) provide affirmative consent through a privacy setting. The company would also be required to delete players’ information that was previously collected in violation of COPPA’s parental notice and consent requirements unless it obtains parental consent to retain such data or the player claims to be 13 or older through a neutral age gate. Additionally, the company must implement a comprehensive privacy program to address the identified violations, maintain default privacy settings, and obtain regular, independent audits. According to the DOJ’s announcement, the company has agreed to pay $275 million in civil penalties—the largest amount ever imposed for a COPPA violation.
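
    The proposed order does not prescribe an implementation, but the defining property of a “neutral” age gate is that it asks for a birth date without hinting at the cutoff or defaulting to an eligible age, so children are not nudged to claim they are older. A minimal sketch in Python (the prompt wording and function names are illustrative):

    ```python
    # A minimal sketch of a neutral age gate. "Neutral" here means: ask for a
    # full birth date, pre-fill nothing, and give no hint that entering an age
    # of 13 or older unlocks anything.
    from datetime import date

    def is_under_13(birthdate: date, today: date | None = None) -> bool:
        today = today or date.today()
        age = today.year - birthdate.year - (
            (today.month, today.day) < (birthdate.month, birthdate.day)
        )
        return age < 13

    birthdate = date.fromisoformat(input("Date of birth (YYYY-MM-DD): "))
    if is_under_13(birthdate):
        # COPPA path: pause personal data collection until a parent provides
        # verifiable consent.
        print("A parent or guardian needs to complete this account setup.")
    else:
        print("Account setup can continue.")
    ```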

    With respect to the illegal dark patterns allegations, the FTC claimed that the company used a variety of dark patterns, such as “counterintuitive, inconsistent, and confusing button configuration[s],” designed to get players of all ages to make unintended in-game purchases. These tactics caused players to pay hundreds of millions of dollars in unauthorized charges, the FTC said, adding that the company also charged account holders for purchases without authorization. Players could purchase in-game content with the press of a button, without any parental or cardholder action or consent being required. Additionally, the company allegedly blocked access to purchased content for players who disputed unauthorized charges with their credit card companies, and threatened players with a lifetime ban if they disputed any future charges. Moreover, cancellation and refund features were purposefully obscured, the FTC asserted.

    To resolve the unlawful billing practices, the proposed administrative order would require the company to pay $245 million in refunds to affected players. The company would also be prohibited from charging players using dark patterns or without obtaining their affirmative consent. Additionally, the order would bar the company from blocking players from accessing their accounts should they dispute unauthorized charges.
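
    The proposed order likewise does not dictate interface code, but “affirmative consent” before billing reduces to a simple rule: no charge is created unless the user completes an unambiguous confirmation step, and doing nothing cancels. A rough sketch, where charge_card stands in for a hypothetical billing call:

    ```python
    # A rough sketch: a charge is created only after an explicit confirmation
    # step, and doing nothing results in no charge (cancel is the default).
    def confirm_purchase(item: str, price_cents: int) -> bool:
        prompt = f"Buy {item} for ${price_cents / 100:.2f}? Type YES to confirm: "
        return input(prompt).strip().upper() == "YES"

    def charge_card(price_cents: int) -> None:
        print(f"Charged {price_cents} cents.")  # hypothetical billing call

    def purchase(item: str, price_cents: int) -> None:
        if not confirm_purchase(item, price_cents):
            print("No charge made.")
            return
        charge_card(price_cents)

    purchase("coin pack", 999)  # illustrative item name and price
    ```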

    Federal Issues FTC DOJ Enforcement Privacy, Cyber Risk & Data Security COPPA FTC Act Unfair UDAP Consumer Finance Dark Patterns

  • FTC takes action against ed tech provider for lax data security

    Federal Issues

    On October 31, the FTC announced an administrative action against an education technology (ed tech) provider, claiming that the company’s allegedly poor data security practices exposed millions of users’ and employees’ sensitive information, including Social Security numbers, email addresses, and passwords. According to the FTC’s complaint, due to the company’s alleged failure to adequately protect the personal information collected from its users and employees, the company experienced four data breaches beginning in September 2017, when a phishing attack granted a hacker access to employees’ direct deposit information. Less than a year later, another data breach involved a former employee who used login information the company shared with employees and outside contractors to gain access to a third-party cloud database containing personal data for roughly 40 million users. In the following two years, the company experienced two more data breaches through phishing attacks that exposed sensitive employee data, including medical and financial information. Claiming violations of Section 5(a) of the FTC Act, the Commission alleged that the company failed to implement basic security measures, stored personal data insecurely, and did not adopt a written security policy until January 2021, despite having experienced three phishing attacks.

    Under the terms of the proposed decision and order, the company would be required to take several measures to address the alleged conduct, including (i) documenting and limiting data collection; (ii) providing users access to collected data and allowing them to submit requests for deletion; (iii) implementing multifactor authentication or another authentication method to protect user and employee accounts; and (iv) implementing a comprehensive information security program that would encrypt consumer data and provide security training to employees, among other things.
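
    The order leaves the choice of authentication method open; as one common illustration of a second factor, the sketch below implements time-based one-time passwords (RFC 6238) using only Python’s standard library. This is a generic example, not the control the order mandates:

    ```python
    # A generic second-factor sketch: RFC 6238 time-based one-time passwords,
    # implemented with the standard library only. The shared secret must be a
    # padded base32 string, as produced by most authenticator-app enrollments.
    import base64, hashlib, hmac, struct, time

    def totp(secret_b32: str, at: float | None = None,
             digits: int = 6, step: int = 30) -> str:
        key = base64.b32decode(secret_b32.upper())
        counter = int((time.time() if at is None else at) // step)
        mac = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
        offset = mac[-1] & 0x0F
        binary = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
        return str(binary % 10 ** digits).zfill(digits)

    def verify_code(secret_b32: str, submitted: str, drift_steps: int = 1) -> bool:
        # Accept the current code plus or minus a small clock-drift window.
        now = time.time()
        return any(
            hmac.compare_digest(totp(secret_b32, now + i * 30), submitted)
            for i in range(-drift_steps, drift_steps + 1)
        )
    ```

    A production deployment would also rate-limit verification attempts and keep the shared secret encrypted at rest.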

    This action is part of the FTC’s ongoing efforts to make sure ed tech providers protect and secure personal data they collect and do not collect more information than necessary. As previously covered by InfoBytes, the FTC issued a policy statement in May warning ed tech providers that they must fully comply with all provisions of the Children’s Online Privacy Protection Act when gathering data about children. The FTC emphasized that ed tech providers may not harvest or monetize children’s data, cannot force children to disclose more information than is reasonably necessary for participating in their educational services, and must have procedures in place to keep the data secure, among other things.

    Federal Issues Privacy, Cyber Risk & Data Security FTC Enforcement FTC Act UDAP COPPA Data Breach Consumer Protection

  • Democrats urge FTC to update COPPA

    Privacy, Cyber Risk & Data Security

    On September 29, Senator Edward J. Markey (D-MA), along with three other Congressional Democrats, sent a letter to FTC Chair Lina Khan requesting that the Commission update its regulations under the Children’s Online Privacy Protection Act (COPPA). The Senators encouraged the FTC to use its regulatory authority to update the COPPA regulations with additional protections that address online threats to children as their use of technology increases. They laid out several areas for the FTC’s consideration, including (i) “expanding the definition of ‘personal information’ covered under COPPA”; (ii) “implementing rules to effectuate COPPA’s prohibition on conditioning a child’s participation in an online activity on the child sharing more data than is reasonably necessary”; (iii) “implementing rules to effectuate COPPA’s requirement that platforms protect the confidentiality, security, and integrity of children’s data”; (iv) “ensuring that COPPA’s requirements protect children on the platforms they actually use by updating COPPA’s regulations defining platforms that are directed to children and updating regulations defining platforms that have actual knowledge they are collecting data from children”; (v) “implementing regulatory protections that reflect the increased use of online platforms for educational purposes”; and (vi) “implementing regulatory protections that reflect changes in online advertising practices.”

    The Senators also applauded the FTC’s recently issued advance notice of proposed rulemaking requesting feedback on questions related to a wide range of concerns about commercial surveillance practices (covered by InfoBytes here), including those involving children and teens, and advised the Commission to closely review and consider expert responses when crafting its rules aimed at the protection of children’s privacy.

    Privacy, Cyber Risk & Data Security Agency Rule-Making & Guidance FTC Federal Issues COPPA Consumer Protection

  • California adopts “first-in-nation” act to safeguard children’s online data and privacy

    Privacy, Cyber Risk & Data Security

    On September 15, the California governor signed into law the California Age-Appropriate Design Code Act (the Act), calling it the “first-in-nation” bill to protect children’s online data and privacy. AB 2273 establishes new legal requirements for businesses that provide online products and services that are “likely to be accessed by children” under 18 years of age based on certain factors. These factors include whether the feature is: (i) “directed to children,” as defined by the Children’s Online Privacy Protection Act (COPPA); (ii) “determined, based on competent and reliable evidence regarding audience composition, to be routinely accessed by a significant number of children”; (iii) advertised to children; (iv) substantially similar to, or the same as, an online service, product, or feature routinely accessed by a significant number of children; (v) designed to appeal to children; or (vi) determined to be, based on internal company research, significantly accessed by children. Notably, in contrast to COPPA, the Act more broadly defines “child” as a consumer who is under the age of 18 (COPPA defines “child” as an individual under 13 years of age).

    The Act also outlines specific requirements for covered businesses, including:

    • Businesses must configure all default privacy settings offered by the online service, product, or feature to one that offers a high level of privacy, “unless the business can demonstrate a compelling reason that a different setting is in the best interests of children”;
    • Businesses must “concisely” and “prominently” provide clear privacy information, terms of service, policies, and community standards suited to the age of the children likely to access the online service, product, or feature;
    • For any new online service, product, or feature likely to be accessed by children and offered to the public before July 1, 2024, businesses must complete a Data Protection Impact Assessment (DPIA) on or before that date. Businesses must also document any “risk of material detriment to children” identified in the DPIA, create a mitigation plan, and, upon written request, provide the DPIA to the state attorney general;
    • Businesses must “[e]stimate the age of child users with a reasonable level of certainty appropriate to the risks that arise from the data management practices of the business or apply the privacy and data protections afforded to children to all consumers” (a rough sketch of this fallback appears after this list);
    • Should an online service, product, or feature allow a child’s parent, guardian, or any other consumer to monitor the child’s online activity or track the child’s location, businesses must provide an obvious signal to the child when the child is being monitored or tracked;
    • Businesses must “[e]nforce published terms, policies and community standards established by the business, including, but not limited to, privacy policies and those concerning children”; and
    • Businesses must provide prominent, accessible, and responsive tools to help children (or their parents/guardians) exercise their privacy rights and report concerns.
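
    As referenced in the list above, the Act pairs high-privacy defaults with a fallback: if a business cannot estimate a user’s age with a reasonable level of certainty, it must extend children’s protections to all consumers. One way to express that logic in Python (the AgeBand type and setting names are illustrative, not drawn from the Act):

    ```python
    # High-privacy defaults, with child protections applied whenever age cannot
    # be estimated with reasonable certainty.
    from dataclasses import dataclass
    from enum import Enum, auto

    class AgeBand(Enum):
        CHILD = auto()    # under 18 for purposes of the Act
        ADULT = auto()
        UNKNOWN = auto()  # age could not be estimated reliably

    @dataclass
    class PrivacySettings:
        # The most protective value is the default for every setting.
        profiling_enabled: bool = False
        precise_geolocation: bool = False
        public_profile: bool = False
        personalized_ads: bool = False
        adjustable: bool = False  # whether the user may later loosen settings

    def settings_for(age_band: AgeBand) -> PrivacySettings:
        """Extend child protections to everyone whose age is unknown."""
        return PrivacySettings(adjustable=(age_band is AgeBand.ADULT))
    ```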

    Additionally, covered businesses are prohibited from using a child’s personal information (i) in a way that the business knows, or has reason to know, is materially detrimental to a child’s physical health, mental health, or well-being; or (ii) for any reason other than the reason for which the personal information was collected, unless the business can show a compelling reason that the use is in the “best interests of children.” The Act also restricts profiling; the collection, sale, or sharing of children’s geolocation data; and the use of dark patterns to encourage children to provide personal information beyond what is reasonably expected.

    The Act also establishes the California Children’s Data Protection Working Group, which will study and report to the legislature best practices for implementing the Act, and will also, among other things, evaluate ways to leverage the expertise of the California Privacy Protection Agency in the long-term development of data privacy policies that affect the privacy, rights, and safety of children online. The state attorney general is tasked with enforcing the Act and may seek an injunction or civil penalty against any business that violates its provisions. Violators may be subject to a penalty of up to $2,500 per affected child for each negligent violation, and up to $7,500 per affected child for each intentional violation; however, businesses may be provided a 90-day cure period if they have achieved “substantial compliance” with the Act’s assessment and mitigation requirements.
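
    To put the per-child penalty structure in perspective with hypothetical numbers: a negligent violation affecting 10,000 children could support penalties of up to $25 million (10,000 × $2,500), and up to $75 million (10,000 × $7,500) if the violation were intentional.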

    The Act takes effect July 1, 2024.

    Privacy, Cyber Risk & Data Security State Issues State Legislation Consumer Protection California COPPA CPPA State Attorney General Enforcement

  • CARU orders app company to correct violations of children’s privacy rules

    Privacy, Cyber Risk & Data Security

    On September 7, the Children’s Advertising Review Unit (CARU) announced that the owner of a cartoon-themed app company has agreed to correct alleged violations of the Children’s Online Privacy Protection Act (COPPA) and CARU’s Self-Regulatory Guidelines for Advertising and for Children’s Online Privacy Protection. CARU found that the company served multiple automated ads, some containing interactive features that mimicked the app’s gameplay, which could not be stopped until users downloaded the advertised app or watched the ad in its entirety. CARU found that these “ads unduly interfered with gameplay, encouraged excessive ad viewing by children through deceptive door openers and other manipulative design techniques, required children to download and install unnecessary apps, and often provided unclear and inconspicuous methods for children to exit the ad and return to the game.” CARU further noted that while its Advertising Guidelines do not require in-app ads to provide an exit method, “they specify that where one is offered it must be clear and conspicuous.” CARU also said that the app “failed to use simple, clear, and conspicuous language to let children know when they were selecting a button that would force them to watch or engage with an ad, and instead used small disclosures in tiny, inconspicuous text.” The company also displayed some ads that were unsafe and inappropriate for children, in violation of CARU’s Advertising Guidelines.

    CARU noted that the company took proactive steps to address each of CARU’s concerns regarding its advertising and privacy practices. Specifically, the company will, among other things, “[u]pdate its age screening mechanism to allow users to freely enter the month and year of their birth and use technical measures to prevent a child from entering a different age once they initially submit their age,” and “[u]pdate its privacy policy to align with COPPA and better reflect its data practices as a mixed-audience site.” In particular, the app company has already voluntarily updated its age screen to direct users to two different versions of the app, one directed toward users under age 13 and a separate version for those age 13 and up.
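
    CARU’s description boils down to two technical measures: remember the first birth date a user submits so a child cannot simply retry with an older age, and route users to the age-appropriate version of the app. A minimal sketch, assuming an in-memory store keyed by an illustrative device identifier (a real app would persist this securely):

    ```python
    # Lock in the first-submitted birth date and route users to the right
    # app version. device_id and the in-memory dict are illustrative; a real
    # app would use secure local or server-side storage.
    from datetime import date

    _first_submitted: dict[str, tuple[int, int]] = {}  # device_id -> (year, month)

    def age_screen(device_id: str, birth_year: int, birth_month: int) -> str:
        """Return which app experience to launch: 'under13' or '13plus'."""
        # Only the first submission counts; later, different answers are ignored.
        birth_year, birth_month = _first_submitted.setdefault(
            device_id, (birth_year, birth_month)
        )
        today = date.today()
        age = today.year - birth_year - (today.month < birth_month)
        return "under13" if age < 13 else "13plus"
    ```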

    Privacy, Cyber Risk & Data Security Enforcement COPPA CARU

  • FTC cracks down on ed tech providers’ COPPA compliance

    Federal Issues

    On May 19, the FTC warned providers of education technology (ed tech) tools for children that they must fully comply with all provisions of the Children’s Online Privacy Protection Act (COPPA). The Commission voted unanimously to approve a policy statement clarifying how COPPA applies to ed tech tools that gather data about children, while underscoring prohibitions on harvesting and monetizing children’s data. The policy statement explained that ed tech providers cannot force children to disclose more information than is reasonably necessary for participating in their educational services and are prohibited from using collected data for marketing or advertising purposes. Additionally, providers are prohibited from retaining children’s data for longer than necessary to fulfill the purpose for which it was collected, and must have procedures in place to keep the data secure. The FTC noted that “even absent a breach, COPPA-covered ed tech providers violate COPPA if they lack reasonable security.” Providers that fail to comply with COPPA may face civil penalties as well as new requirements and limitations on their business practices to stop the unlawful conduct. The policy statement comes as the FTC reexamines COPPA. As previously covered by InfoBytes, the Commission launched a rule review in 2019.
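
    The retention principle in the policy statement lends itself to a purpose-bound data model: each record carries the purpose it was collected for and a date after which that purpose is fulfilled. A minimal sketch (field names are illustrative, not drawn from the policy statement):

    ```python
    # Purpose-bound retention: each record carries the purpose it was collected
    # for and a date after which that purpose is fulfilled.
    from dataclasses import dataclass
    from datetime import datetime, timezone

    @dataclass
    class ChildRecord:
        student_id: str
        purpose: str            # e.g., "grade_reporting"
        retain_until: datetime  # when the stated purpose is fulfilled

    def purge_expired(records: list[ChildRecord]) -> list[ChildRecord]:
        """Keep only records whose collection purpose is still being served."""
        now = datetime.now(timezone.utc)
        return [r for r in records if r.retain_until > now]
    ```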

    Federal Issues FTC COPPA Privacy, Cyber Risk & Data Security Ed Tech

  • Connecticut legislature passes consumer data privacy bill

    Privacy, Cyber Risk & Data Security

    Recently, the Connecticut legislature passed SB 6, which would enact provisions related to consumer data privacy and online monitoring. Highlights of the bill include:

    • Applicability. The bill will apply to a controller that conducts business in the state or produces products or services for consumer residents that, during the preceding calendar year, “controlled or processed the personal data of not less than seventy-five thousand consumers, excluding personal data controlled or processed solely for the purpose of completing a payment transaction” or “controlled or processed the personal data of not less than twenty-five thousand consumers and derived more than twenty-five per cent of their gross revenue from the sale of personal data.” Certain entities and types of data are exempt from the bill’s requirements, including state governmental entities; nonprofits; institutions of higher education; national securities associations registered under the Securities Exchange Act of 1934; financial institutions or data subject to federal privacy disclosure requirements; hospitals; certain types of health information subject to federal health privacy laws; consumer reporting agencies, furnishers, and consumer report users of information involving personal data bearing on a consumer’s credit; personal data regulated by certain federal regulations; and air carriers. Additionally, controllers and processors will be considered to be in compliance with the bill’s parental consent obligations provided they comply with verifiable parental consent mechanisms under the Children’s Online Privacy Protection Act.
    • Consumer rights. Under the bill, consumers will be able to, among other things, (i) confirm whether their personal data is being processed and access their data; (ii) correct inaccuracies; (iii) delete their data; (iv) obtain a copy of personal data processed by a controller; and (v) opt out of the processing of their data for targeted advertising, the sale of their data, or profiling in furtherance of solely automated decisions. A consumer may designate another person to serve as his or her authorized agent to opt out of the processing of such consumer’s personal data.
    • Controllers’ and processors’ responsibilities. Under the bill, controllers will be responsible for responding to consumers’ requests within 45 days (an additional 45-day extension may be requested under certain circumstances). Responses to consumers’ requests must be provided free of charge, unless the request is “manifestly unfounded, excessive or repetitive,” in which case a controller may charge a reasonable administrative fee or decline to act on the request (a controller bears the burden of explaining the denial and must also establish an appeals process, including a method through which a consumer may submit a complaint to the state attorney general). Among other things, controllers must “[l]imit the collection of personal data to what is adequate, relevant and reasonably necessary in relation to the purposes for which such data is processed, as disclosed to the consumer” and are required to implement data security protection practices “appropriate to the volume and nature of the personal data at issue” and conduct data protection assessments for processing activities that present a heightened risk of harm to consumers. Controllers may not process personal data in violation of federal and state laws that prohibit unlawful discrimination against consumers and must provide an effective mechanism for consumers to revoke consent that is at least as easy as the method used to provide consent. Controllers must cease processing data within 15 days of receiving a revocation request (the response and revocation windows are sketched after this list). The bill also requires controllers to provide privacy notices to consumers disclosing certain information regarding data collection and sharing practices (including sharing with third parties), and if the controller sells a consumer’s personal data to third parties or engages in targeted advertising, the controller must disclose how consumers may exercise their rights under the bill. Controllers also will be prohibited from processing sensitive personal data without first presenting a consumer with the opportunity to opt out. The bill further specifies requirements for processing de-identified data or pseudonymous data. Data processors must adhere to a controller’s instructions and enter into contracts with clearly specified instructions for processing personal data.
    • Private right of action and state attorney general enforcement. The bill explicitly prohibits a private right of action. Instead, it grants the state attorney general exclusive authority to enforce the law. The attorney general may also require a controller to disclose any data protection assessments relevant to an investigation. A violation of the bill’s provisions will constitute an unfair trade practice.
    • Right to cure. Upon discovering a potential violation of the bill, the attorney general (from July 1, 2023, through December 31, 2024) must provide a controller or processor written notice of the alleged violation. The controller or processor then has 60 days to cure the alleged violation before the attorney general can file suit. Beginning on January 1, 2025, the attorney general, when determining whether to provide a controller or processor the opportunity to cure an alleged violation, may consider the number of violations, the controller/processor’s size and complexity, the nature and extent of the processing activities, the substantial likelihood of public injury, and the safety of persons or property.
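
    As noted in the list above, the bill’s response clocks reduce to simple date arithmetic. A minimal sketch of the windows described there (45 days to respond, one optional 45-day extension, and 15 days to honor a consent revocation):

    ```python
    # The response windows described above: 45 days to respond, one optional
    # 45-day extension, and 15 days to honor a consent revocation.
    from datetime import date, timedelta

    def response_due(received: date, extended: bool = False) -> date:
        """Date by which a controller must answer a consumer request."""
        return received + timedelta(days=90 if extended else 45)

    def revocation_due(received: date) -> date:
        """Date by which processing must cease after consent is revoked."""
        return received + timedelta(days=15)

    print(response_due(date(2023, 7, 1)))                 # 2023-08-15
    print(response_due(date(2023, 7, 1), extended=True))  # 2023-09-29
    print(revocation_due(date(2023, 7, 1)))               # 2023-07-16
    ```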

    If enacted in its current form, the bill would take effect July 1, 2023.

    Privacy, Cyber Risk & Data Security State Issues State Legislation Connecticut Consumer Protection COPPA State Attorney General Enforcement

  • Social networking apps settle minors' data claims for $1.1 million

    Privacy, Cyber Risk & Data Security

    On March 25, the U.S. District Court for the Northern District of Illinois granted final approval to a $1.1 million class action settlement resolving claims that the operators of two video social networking apps (defendants) “‘surreptitiously tracked, collected, and disclosed the personally identifiable information and/or viewing data of children under the age of 13,’ ‘without parental consent’” in violation of federal and California privacy law. Specifically, plaintiffs asserted violations of the Video Privacy Protection Act (VPPA), the California constitutional right to privacy, the California Consumers Legal Remedies Act (CLRA), and the Illinois Consumer Fraud and Deceptive Business Practices Act. Defendants countered that plaintiffs’ state-law claims were preempted by the Children’s Online Privacy Protection Act, and that, furthermore, the “alleged conduct is not within the scope of VPPA or the cited state consumer protection laws” and “does not amount to a common law invasion of privacy or a violation of Plaintiffs’ rights under the California Constitution.” Moreover, defendants argued that plaintiffs could not recover actual damages. According to plaintiffs’ supplemental motion for final approval, following months-long negotiations, the parties agreed to settle the action on a class-wide basis.

    The settlement requires defendants to pay $1.1 million into a non-reversionary settlement fund, to be disbursed pro rata to class members (anyone in the U.S. who, prior to the settlement’s effective date and while under the age of 13, registered for or used the apps) who submit a valid claim, after the payment of settlement administration expenses, taxes, fees, and service awards. The court’s order, however, declined to award an objector’s counsel any attorneys’ fees for his efforts to negotiate modified relief because the agreement was negotiated in a separate proceeding in related multidistrict litigation. The court also denied plaintiffs’ motion for sanctions against the objector’s law firm.
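
    To make the pro rata mechanics concrete with purely hypothetical numbers: if $700,000 remained in the fund after expenses, taxes, fees, and service awards, and 70,000 class members submitted valid claims, each claimant would receive $10 ($700,000 ÷ 70,000).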

    Privacy, Cyber Risk & Data Security Courts Settlement Class Action State Issues Illinois California COPPA

  • CARU orders app company to correct violations of children’s privacy rules

    Privacy, Cyber Risk & Data Security

    On March 8, the Children’s Advertising Review Unit (CARU) announced that a smart watch phone operator has agreed to take actions to correct alleged violations of the Children’s Online Privacy Protection Act (COPPA) and CARU’s Self-Regulatory Guidelines for Children’s Online Privacy Protection. According to the press release, CARU is the nation’s first FTC-approved COPPA Safe Harbor Program and is tasked with monitoring online services for compliance with COPPA and CARU’s privacy guidelines to make sure the collection of children’s data is handled responsibly. CARU examined the company’s data handling and sharing practices and found that the company, among other things, “failed to provide clear and complete, and non-confusing, notice of its children’s information collection practices in its privacy policy and failed to provide any notice that would constitute a direct notice to parents as required by COPPA.” The company also failed to offer a method for parents to provide verifiable consent to its data gathering practices prior to its collection of information from children, CARU stated, adding that the company’s privacy policy, terms of service, and other online disclosures also included “inconsistent, confusing and/or contradictory statements about its collection, use, or disclosure of children's personal information.”

    CARU noted that the company submitted a “detailed plan” outlining measures to remedy the concerns and agreed to correct the violations in order to comply with CARU’s privacy guidelines and COPPA. The company will also update its privacy policy to include information on how parents can prohibit the use of their child’s data or have it deleted and will obtain verifiable parental consent prior to completing the registration process. CARU also recommended that the company revise its website and app to provide parents with “direct notice of what personal information the operator can collect from children through their use of the service, both passively and actively, and how such personal information can be used and disclosed, together with a clear and prominent link to its privacy policy.”

    Privacy, Cyber Risk & Data Security Enforcement COPPA CARU FTC

  • FTC, DOJ reach $1.5 million settlement with weight-loss companies

    Federal Issues

    On March 4, the FTC and DOJ announced a $1.5 million settlement with an international weight loss service organization and its subsidiary (collectively, “defendants”) accused of using unfair and deceptive practices to obtain personal information of underage users without parental consent. As previously covered by InfoBytes, the agencies claimed that the defendants violated the Children’s Online Privacy Protection Act (COPPA) and Section 5 of the FTC Act by collecting and keeping personal information from children under 13 without providing notice to or obtaining consent from their parents. The agencies’ settlement announcement stated that the defendants’ signup process originally “encouraged younger users to falsely claim they were over the age of 13, despite text indicating that children under 13 must sign up through a parent,” and that even after the signup process was revised, the defendants allegedly “failed to provide a mechanism to ensure that those who choose the parent signup option were indeed parents and not a child trying to bypass the age restriction.” Additionally, the defendants allegedly violated COPPA’s data retention provisions “by retaining children’s personal information indefinitely and only deleting it when requested by a parent.”

    Under the terms of the settlement, unless verifiable parental consent is subsequently obtained, the defendants are required to refrain from disclosing, using, or benefiting from previously collected personal information that did not comply with COPPA’s parental notice and consent requirements, and must destroy all such previously collected personal information, as well as any affected work product that used illegally collected data. The settlement also orders the defendants to pay a $1.5 million civil penalty.

    Federal Issues FTC Enforcement DOJ Privacy, Cyber Risk & Data Security COPPA FTC Act
