Under the terms of the proposed court order filed June 27 stipulating to an injunction, monetary judgment, and other relief, the defendant would be required to pay $18.5 million in monetary relief and make numerous changes to its email and internet operations. Among other things, the defendant would be required to clearly and conspicuously disclose on every shopping page that a purchase is not required to enter a sweepstakes and that purchasing will not help a consumer win. Consumers would also be required, in many cases, to acknowledge this disclosure when responding to a call to action that results in an order. The defendant must also clearly disclose material costs and terms of purchase, as well as any additional fees, and cancellation and return policies. Additionally, the defendant would be required to delete all consumer data collected prior to January 1, 2019, unless required for processing transactions, and stop misrepresenting its data collection and sharing practices.
On April 13, the U.S. District Court for the Northern District of Illinois denied a credit reporting agency’s (CRA) bid to stay litigation filed by the CFPB alleging deceptive practices related to the marketing and sale of credit scores, credit reports, and credit-monitoring products to consumers. The Bureau sued the CRA and one of its former senior executives last April (covered by InfoBytes here), alleging that the defendants violated a 2017 consent order by continuing to engage in “digital dark patterns” that caused consumers seeking free credit scores to unknowingly sign up for a credit monitoring service with recurring monthly charges.
The CRA requested a stay while the U.S. Supreme Court considers whether the Bureau’s funding mechanism is unconstitutional. Earlier this year, the Court agreed to review next term the 5th Circuit’s decision in Community Financial Services Association of America v. Consumer Financial Protection Bureau, in which the appellate court found that the CFPB’s “perpetual self-directed, double-insulated funding structure” violated the Constitution’s Appropriations Clause. (Covered by InfoBytes here and a firm article here.) While acknowledging that a ruling against the Bureau may result in the dismissal of the action against the CRA, the court concurred with the Bureau that consumers may be exposed to harm during a stay. “Were I to grant the requested stay, it could last more than one year, depending on when the Supreme Court issues its opinion,” the court wrote. “In that time, if the Bureau’s allegations bear out, consumers will continue to suffer harm because of defendants’ unlawful conduct. That potential cost is too great to outweigh the resource preserving benefits a stay would confer.”
On March 14, the FTC finalized an administrative order requiring a video game developer to pay $245 million in refunds to consumers allegedly tricked into making unwanted in-game purchases. As previously covered by InfoBytes, the FTC filed an administrative complaint claiming players were able to accumulate unauthorized charges without parental or card holder action or consent. The FTC alleged that the company used a variety of dark patterns, such as “counterintuitive, inconsistent, and confusing button configuration[s],” designed to get players of all ages to make unintended in-game purchases. These tactics caused players to pay hundreds of millions of dollars in unauthorized charges, the FTC said, adding that the company also charged account holders for purchases without authorization. Under the terms of the final decision and order, the company is required to pay $245 million in refunds to affected card holders. The company is also prohibited from charging players using dark patterns or without obtaining their affirmative consent. Additionally, the company is barred from blocking players from accessing their accounts should they dispute unauthorized charges.
Separately, last month the U.S. District Court for the Eastern District of North Carolina entered a stipulated order against the company related to alleged violations of the Children’s Online Privacy Protection Act (COPPA). The FTC claimed the company failed to protect underage players’ privacy and collected personal information without first notifying parents or obtaining parents’ verifiable consent. Under the terms of the order, the company is required to ensure parents receive direct notice of its practices with regard to the collection, use or disclosure of players’ personal information, and must delete information previously collected in violation of COPPA’s parental notice and consent requirements unless it obtains parental consent to retain such data or the player claims to be 13 or older through a neutral age gate. Additionally, the company is required to implement a comprehensive privacy program to address the identified violations, maintain default privacy settings, obtain regular, independent audits, and pay a $275 million civil penalty (the largest amount ever imposed for a COPPA violation).
On February 1, the DOJ filed a complaint on behalf of the FTC against a telehealth and prescription drug discount provider for allegedly violating the FTC Act and the Health Breach Notification Rule by failing to notify consumers that it was disclosing their personal health information to third parties for advertising purposes. The FTC stated that, as a vendor of personal health records, the company is required to comply with the Health Breach Notification Rule, which imposes certain reporting obligations on health apps and other companies that collect or use consumers’ health information (previously covered by InfoBytes here).
According to the complaint filed in the U.S. District Court for the Northern District of California, the company—which allows users to keep track of their personal health information, including saving, tracking, and receiving prescription alerts—shared sensitive personal health information with advertisers and other third parties for years, even though it allegedly promised users that their health information would never be shared. The FTC maintained that the company also monetized users’ personal health information and used certain shared data to target its own users with personalized health- and medication-specific advertisements on various social media platforms. The company also allegedly: (i) permitted third parties to use shared data for their own internal purposes; (ii) falsely claimed compliance with the Digital Advertising Alliance principles (which require companies to obtain consent prior to using health information for advertising purposes); (iii) misrepresented its HIPAA compliance; (iv) failed to maintain sufficient formal, written, or standard privacy or data sharing policies or procedures to protect personal health information; and (v) failed to report the unauthorized disclosures.
Under the terms of the proposed court order filed by the DOJ, the company would be required to pay a $1.5 million civil penalty, and would be prohibited from engaging in the identified alleged deceptive practices and from sharing personal health information with third parties for advertising purposes. The company would also be required to implement several measures to address the identified violations, including obtaining users’ affirmative consent before disclosing information to third parties (the company would be prohibited from using “dark patterns,” or manipulative designs, to obtain consent), directing third parties to delete shared data, notifying users about the breaches and the FTC’s enforcement action, implementing a data retention schedule, and putting in place a comprehensive privacy program to safeguard consumer data.
On January 19, the CFPB released Circular 2023-01 to reiterate that companies offering “negative option” subscription services are required to comply with federal consumer financial protection laws. According to the Circular, “‘negative option’ [marketing] refers to a term or condition under which a seller may interpret a consumer’s silence, failure to take an affirmative action to reject a product or service, or failure to cancel an agreement as acceptance or continued acceptance of the offer.” The Bureau clarified that negative option marketing practices could violate the CFPA where a seller: (i) misrepresents or fails to clearly and conspicuously disclose the material terms of a negative option program; (ii) fails to obtain consumers’ informed consent; or (iii) misleads consumers who want to cancel, erects unreasonable barriers to cancellation, or fails to honor cancellation requests that comply with its promised cancellation procedures.
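The Circular’s framework can be reduced to a simple rule: silence or inaction never counts as acceptance, and consent holds only after clear disclosure plus an affirmative act by the consumer. The sketch below is purely illustrative; the function and parameter names are hypothetical and not drawn from the Circular itself:

```python
def negative_option_consent_valid(terms_clearly_disclosed: bool,
                                  affirmative_opt_in: bool) -> bool:
    """Illustrative check of the Circular's framework: a seller may not
    treat a consumer's silence or failure to act as acceptance. Consent
    is valid only when the material terms were clearly and conspicuously
    disclosed AND the consumer took an affirmative act (e.g., ticking a
    box that was unchecked by default).
    """
    return terms_clearly_disclosed and affirmative_opt_in
```

On this reading, a pre-checked box or an unanswered renewal notice fails the second condition, no matter how prominent the disclosure was.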
The Bureau described receiving complaints from older consumers about being repeatedly charged for services they did not intend to buy or no longer wanted. Other consumers reported being enrolled in subscriptions without knowledge of the program or its costs. Consumers also submitted complaints about the difficulty of canceling subscription-based services and about charges to their credit card or bank account after they requested cancellation.
The Bureau also warned that negative option programs can be particularly harmful when paired with dark patterns. The Circular noted that the Bureau and the FTC have taken action to combat the rise of digital dark patterns, which can be used to deceive, steer, or manipulate users into behavior that is profitable for a company, but often harmful to users or contrary to their intent. The Bureau noted that consumers could be misled into purchasing subscriptions and other services with recurring charges and be unable to cancel the unwanted products and services or avoid their charges.
On December 19, the DOJ filed a complaint on behalf of the FTC against a video game developer for allegedly violating the Children’s Online Privacy Protection Act (COPPA) by failing to protect underage players’ privacy. The FTC also alleged in a separate administrative complaint that the company employed “dark patterns” to trick consumers into making unwanted in-game purchases, thus allowing players to accumulate unauthorized charges without parental involvement. (See also FTC press release here.)
According to the complaint filed in the U.S. District Court for the Eastern District of North Carolina, the company allegedly collected personal information from players under the age of 13 without first notifying parents or obtaining parents’ verifiable consent. Parents who requested that their children’s personal information be deleted allegedly had to take unreasonable measures, the FTC claimed, and the company sometimes failed to honor these requests. The company is also accused of violating the FTC Act’s prohibition against unfair practices when its settings enabled, by default, real-time voice and text chat communications for children and teens. These default settings, as well as a matching system that paired children and teens with strangers to play the game, exposed players to threats, harassment, and psychologically traumatizing issues, the FTC maintained. While company employees and players alike raised concerns about the default settings, the FTC said that the company resisted turning them off and, when it eventually did take action, made it difficult for players to figure out how to turn voice chat off.
Under the terms of a proposed court order filed by the DOJ, the company would be prohibited from enabling voice and text communications unless parents (of players under the age of 13) or teenage users (or their parents) provide affirmative consent through a privacy setting. The company would also be required to delete players’ information that was previously collected in violation of COPPA’s parental notice and consent requirements unless it obtains parental consent to retain such data or the player claims to be 13 or older through a neutral age gate. Additionally, the company must implement a comprehensive privacy program to address the identified violations, maintain default privacy settings, and obtain regular, independent audits. According to the DOJ’s announcement, the company has agreed to pay $275 million in civil penalties—the largest amount ever imposed for a COPPA violation.
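The “neutral age gate” referenced in the proposed order is a screen that asks for a user’s age or date of birth without steering the answer: no pre-filled or default values, no hint that an under-13 answer limits features, and no immediate chance to re-enter a different age. A minimal sketch of the downstream check, with hypothetical names not drawn from the order:

```python
from datetime import date

# COPPA's parental notice-and-consent requirements apply to children under 13.
COPPA_AGE_THRESHOLD = 13

def requires_parental_consent(birth_date: date, today: date) -> bool:
    """Return True if a user who gave this birth date at a neutral age
    gate is under 13 on `today`, triggering COPPA's parental notice and
    verifiable-consent requirements before personal data is collected.
    """
    # Compute age in whole years, subtracting one if the birthday
    # has not yet occurred this year.
    age = today.year - birth_date.year - (
        (today.month, today.day) < (birth_date.month, birth_date.day)
    )
    return age < COPPA_AGE_THRESHOLD
```

The neutrality requirement lives in the gate’s presentation rather than in this check: the logic is trivial, but the screen that collects `birth_date` must not nudge users toward an age that avoids the consent flow.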
With respect to the illegal dark patterns allegations, the FTC claimed that the company used a variety of dark patterns, such as “counterintuitive, inconsistent, and confusing button configuration[s],” designed to get players of all ages to make unintended in-game purchases. These tactics caused players to pay hundreds of millions of dollars in unauthorized charges, the FTC said, adding that the company also charged account holders for purchases without authorization. Players were able to purchase in-game content by pressing buttons, without any parental or card holder action or consent. Additionally, the company allegedly blocked access to purchased content for players who disputed unauthorized charges with their credit card companies, and threatened players with a lifetime ban if they disputed any future charges. Moreover, cancellation and refund features were purposefully obscured, the FTC asserted.
To resolve the unlawful billing practices, the proposed administrative order would require the company to pay $245 million in refunds to affected players. The company would also be prohibited from charging players using dark patterns or without obtaining their affirmative consent. Additionally, the order would bar the company from blocking players from accessing their accounts should they dispute unauthorized charges.
On November 1, the FTC held its annual PrivacyCon event, which hosted research presentations on a wide range of consumer privacy and security issues. Opening the event, FTC Chair Lina Khan stressed the importance of hearing from the academic community on topics related to a range of privacy issues that the FTC and other government bodies may miss. Khan emphasized that regulators cannot wait until new technologies fully emerge to think of ways to implement new laws for safeguarding consumers. “The FTC needs to be on top of this emerging industry now, before problematic business models have time to solidify,” Khan said, adding that the FTC is consistently working on privacy matters and is “prioritizing the use of creative ideas from academia in [its] bread-and-butter work” to craft better remedies that reflect what is actually happening. She highlighted a recent enforcement action taken against an online alcohol marketplace and its CEO for failing to take reasonable steps to prevent two major data breaches (covered by InfoBytes here). Khan noted that while the settlement’s requirements, such as imposing multi-factor authentication and destroying unneeded user data, may not sound “very cutting-edge,” they serve as a big step forward for government enforcers.

Chief Technology Officer Stephanie Nguyen, who leads the effort to integrate technologists across the FTC’s various lines of work, including consumer privacy, discussed how these technologists (including AI and security experts, software engineers, designers, and data scientists) help develop remedies in data security-related enforcement actions and push companies not just to do the minimum to remediate problems like unreasonable data security, but to model best practices for the industry. “We want to see bad actors face real consequences,” Nguyen said, adding that the FTC wants to hold corporate leadership accountable as it did in the enforcement action Khan cited.
Nguyen further stressed that there is also a need to address systemic risk by making companies delete illegally collected data and destroy any algorithms derived from the data.
The one-day conference featured several panel sessions covering a number of topics related to consumer surveillance, automated decision-making systems, children’s privacy, devices that listen to users, augmented/virtual reality, interfaces and dark patterns, and advertising technology. Topics addressed during the panels included (i) requiring data brokers to provide accurate information; (ii) understanding how data inaccuracies can disproportionately affect minorities and those living in poverty, and why relying on this data can lead to discriminatory practices; (iii) examining bias and discrimination risks when engaging in emotional artificial intelligence; (iv) understanding automated decision-making systems and how the quality of these systems impacts the populations they are meant to represent; (v) recognizing the lack of transparency related to children’s data collection and use, and the impact various privacy laws, including the Children’s Online Privacy Protection Rule, the General Data Protection Regulation, and the California Consumer Privacy Act, have on the collection/use/sharing of personal data; (vi) recognizing challenges related to cookie-consent interfaces and dark patterns; and (vii) examining how targeted online advertising both in the U.S. and abroad affects consumers.
On November 3, the FTC announced an action against an internet phone service provider, claiming the company imposed “junk fees” and made it difficult for consumers to cancel their services. The FTC alleged in its complaint that the company violated the FTC Act and the Restore Online Shoppers’ Confidence Act by imposing a series of obstacles, sometimes referred to as “dark patterns,” to deter and prevent consumers from canceling their services or stopping recurring charges. Consumers who were able to sign up for services online were allegedly forced to speak to a live “retention agent” on the phone during limited working hours in order to cancel. The company also allegedly subjected consumers seeking to cancel to a “panoply of hurdles” by, among other things, making it difficult to locate the cancellation phone number on its website, obscuring contact information, failing to consistently transfer consumers to the appropriate number, imposing lengthy wait times, reducing operating hours for the cancellation line, and failing to provide promised callbacks. Additionally, the FTC claimed the company often informed consumers they would have to pay an early termination fee (sometimes hundreds of dollars) that was not clearly disclosed when they signed up for the services, and continued to illegally charge consumers without consent even after they requested cancellation. According to the FTC, consumers who complained often received only partial refunds.
Under the terms of the proposed stipulated order, the company will be required to take several measures, including (i) obtaining consumers’ express, informed consent to charge them for services; (ii) simplifying the cancellation process to ensure it is easy to find and use and is available through the same method the consumer used to enroll; (iii) ending the use of dark patterns to impede consumers’ cancellation efforts; and (iv) being transparent about the terms of any negative option subscription plans, including providing required disclosures as well as a simple mechanism for consumers to cancel the feature. The company will also be required to pay $100 million in monetary relief.
On October 18, the CFPB filed a complaint against a Texas-based payment processing service platform (primarily related to collecting and processing event fees) for allegedly violating the Consumer Financial Protection Act (CFPA) and the EFTA by engaging in deceptive and abusive acts and practices. The Bureau alleged that the defendant enrolled consumers in, and charged them for, discount club memberships without their consent, memberships that were largely unrelated to the event the consumers were signing up for. The complaint noted that although the defendant’s memberships offered a 30-day free “negative option trial membership,” membership fees were automatically charged at the end of the trial period. The Bureau also alleged that the defendant deployed dark patterns, which “are hidden tricks or trapdoors that companies build into their websites to get consumers to inadvertently click links, sign up for subscriptions, or purchase products or services.” The Bureau further alleged that the defendant violated the EFTA and Regulation E by increasing consumers’ membership fees without sending the consumer written notice of the new amount and the date of the new payment at least 10 days before initiating the new payment, conduct that also constituted a violation of the CFPA. The Bureau is seeking permanent injunctive relief, damages, restitution, disgorgement, civil money penalties, and other relief.
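The Regulation E timing requirement at issue reduces to simple date arithmetic: written notice of the new amount must reach the consumer at least 10 days before the first payment in that amount is initiated. A minimal sketch under that assumption (the function and variable names are hypothetical, and the rule’s full notice-content requirements are broader than this date check):

```python
from datetime import date, timedelta

# Regulation E: written notice of a preauthorized transfer in a new
# amount must be sent at least 10 days before the scheduled transfer.
REQUIRED_NOTICE_DAYS = 10

def fee_change_notice_is_timely(notice_sent: date, new_payment_date: date) -> bool:
    """True if notice of the increased fee preceded the first payment
    in the new amount by at least 10 days."""
    return new_payment_date - notice_sent >= timedelta(days=REQUIRED_NOTICE_DAYS)
```

Per the complaint, the defendant allegedly skipped this notice entirely, which would fail the check regardless of timing.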
According to a statement by CFPB Director Rohit Chopra, the Bureau is “closely watching whether financial services firms are deploying digital dark patterns,” and is “looking at a range of ways to reduce unwanted junk fees.” He also added that the Bureau is “working to ensure our payments system is working safely and fairly” and that it “will continue to look at how payment platforms extract data and fees from their users.”
On September 15, the FTC released a report, Bringing Dark Patterns to Light, examining how “dark patterns” can affect consumer choice and decision-making and could violate the law. The report stems from an April 2021 workshop that the Commission held to explore dark patterns. According to the FTC, the dark pattern tactics detailed in the report include disguising ads to appear like independent content, making it difficult for consumers to cancel subscriptions or charges, burying key terms or junk fees, and tricking consumers into sharing their data. The report highlighted the FTC’s efforts to combat the use of dark patterns in the marketplace and reiterated the Commission’s commitment to taking action against tactics designed to trick and trap consumers. Among other things, the report noted four common dark pattern tactics, which include design elements that: (i) induce false beliefs; (ii) hide or delay disclosure of material information; (iii) lead to unauthorized charges; and (iv) obscure or subvert privacy choices. The report also cited a 2017 case brought against a company as an example of past enforcement work, in which the FTC fined the company $2.2 million for enabling default settings that allowed its smart TVs to collect and share consumers’ viewing activity with third parties, providing only a brief notice to some consumers that the agency said could easily be missed.