On October 7, the California governor approved SB 478 (the “Act”), amending the Consumers Legal Remedies Act to prohibit “drip pricing,” the practice of advertising a price that is lower than the actual price a consumer will have to pay for a good or service. Subject to specified exceptions, the Act makes it unlawful to advertise the price of a good or service without including all additional fees or charges other than taxes. The California Legislature declared that the Act is not intended to prohibit any particular method of determining prices for goods or services, including algorithmic or dynamic pricing; rather, it is intended to regulate how prices are advertised, displayed, or offered.
The Act is effective July 1, 2024.
On July 14, the FTC finalized an order against an online counseling service, requiring it to pay $7.8 million and prohibiting the sharing of consumers’ health data for advertising purposes. The FTC alleged that the respondent shared consumers’ sensitive health data with third parties despite promising to keep such information private (covered by InfoBytes here). The FTC said it will use the settlement funds to provide partial refunds to affected consumers. The order not only bans the respondent from disclosing health data for advertising and marketing purposes but also prohibits the sharing of consumers’ personal information for re-targeting. The order also stipulates that the respondent must now obtain consumers’ affirmative express consent before disclosing personal information, implement a comprehensive privacy program with certain data protection measures, instruct third parties to delete shared data, and adhere to a data retention schedule.
On June 29, the FTC announced the finalized updated version of its Endorsement Guides, which help the agency combat deceptive reviews and endorsements. The FTC says the guides, which were last updated in 2009, advise businesses on what practices are considered unfair or deceptive in violation of the FTC Act. The FTC explained that the updated version takes into account comments solicited earlier this year and reflects the ways advertisers now reach consumers. The revisions: (i) “articulate a new principle regarding procuring, suppressing, boosting, organizing, publishing, upvoting, downvoting, or editing consumer reviews so as to distort what consumers think of a product”; (ii) “address incentivized reviews, reviews by employees, and fake negative reviews of a competitor”; (iii) “add a definition of ‘clear and conspicuous’” and say that a platform’s built-in disclosure tool might not be an adequate disclosure; (iv) “[change] the definition of ‘endorsements’ to clarify the extent to which it includes fake reviews, virtual influencers, and tags in social media”; (v) “better explain the potential liability of advertisers, endorsers, and intermediaries”; and (vi) highlight that child-directed advertising is of special concern. The FTC concurrently issued an updated version of its guidance regarding frequently asked questions about the Endorsement Guides.
Covered entities in California are reminded that Section 1770 of the Consumers Legal Remedies Act requires persons offering or providing a consumer financial product or service to include certain language in solicitations. As previously covered by InfoBytes, AB 1904 was enacted last year to amend Section 1770 of the Civil Code, which addresses unfair methods of competition and unfair or deceptive acts. The amended code prohibits a covered person or a service provider from engaging in unlawful, unfair, deceptive, or abusive acts or practices in connection with a consumer financial product or service, such as: (i) misrepresenting the source, sponsorship, approval, or certification of goods or services; (ii) advertising goods or services with the intent not to sell them as advertised; and (iii) making false or misleading statements of fact concerning the reasons for, existence of, or amounts of price reductions. The amendments authorize the California Department of Financial Protection and Innovation to bring a civil action for a violation of the law. They also make it unlawful to omit certain information, including a prescribed disclosure, from a solicitation to a consumer for a consumer financial product or service by a covered person or an entity acting on behalf of a covered person. Specifically, Cal. Civ. Code § 1770(a)(28) requires covered persons to include the following language in solicitations:
- “The name of the covered person, and, if applicable, the entity acting on behalf of the covered person, and relevant contact information, including a mailing address and telephone number.”
- “The following disclosure statement in at least 18-point bold type and in the language in which the solicitation is drafted: ‘THIS IS AN ADVERTISEMENT. YOU ARE NOT REQUIRED TO MAKE ANY PAYMENT OR TAKE ANY OTHER ACTION IN RESPONSE TO THIS OFFER.’”
The requirements took effect at the beginning of the year.
On March 2, the FTC filed a complaint against an online counseling service alleging the respondent violated the FTC Act by monetizing consumers’ sensitive health data for targeted advertising purposes. As part of the process to sign up for the respondent’s counseling services, consumers are required to provide sensitive mental health information, as well as other personal information. Consumers are promised that their personal health data will not be used or disclosed except for limited purposes, such as for counseling services. However, the FTC claimed the respondent used and revealed consumers’ sensitive health data to third parties for advertising purposes. According to the FTC, the respondent failed to maintain sufficient policies or procedures to protect the sensitive information and did not obtain consumers’ affirmative express consent before disclosing the health data. The respondent also allegedly failed to limit how third parties could use the health data and denied reports that it revealed consumers’ sensitive information.
Under the terms of the proposed consent order, the respondent will be required to pay $7.8 million in partial refunds to affected users and will be banned from disclosing health information to certain third parties for re-targeting advertising purposes. This will be the first FTC action returning funds to consumers whose health data was compromised. The respondent will also be prohibited from misrepresenting its sharing practices and must also (i) obtain users’ affirmative express consent before disclosing personal information to certain third parties for any purpose; (ii) implement a comprehensive privacy program with strong safeguards to protect users’ data; (iii) instruct third parties to delete shared personal data; and (iv) implement a data retention schedule imposing limits on how long personal data can be retained.
On February 1, the DOJ filed a complaint on behalf of the FTC against a telehealth and prescription drug discount provider for allegedly violating the FTC Act and the Health Breach Notification Rule by failing to notify consumers that it was disclosing their personal health information to third parties for advertising purposes. The FTC stated that, as a vendor of personal health records, the company is required to comply with the Health Breach Notification Rule, which imposes certain reporting obligations on health apps and other companies that collect or use consumers’ health information (previously covered by InfoBytes here).
According to the complaint filed in the U.S. District Court for the Northern District of California, the company—which allows users to keep track of their personal health information, including saving, tracking, and receiving prescription alerts—shared sensitive personal health information with advertisers and other third parties for years, even though it allegedly promised users that their health information would never be shared. The FTC maintained that the company also monetized users’ personal health information and used certain shared data to target its own users with personalized health- and medication-specific advertisements on various social media platforms. The company also allegedly: (i) permitted third parties to use shared data for their own internal purposes; (ii) falsely claimed compliance with the Digital Advertising Alliance principles (which require companies to obtain consent prior to using health information for advertising purposes); (iii) misrepresented its HIPAA compliance; (iv) failed to maintain sufficient formal, written, or standard privacy or data sharing policies or procedures to protect personal health information; and (v) failed to report the unauthorized disclosures.
Under the terms of the proposed court order filed by the DOJ, the company would be required to pay a $1.5 million civil penalty, and would be prohibited from engaging in the identified alleged deceptive practices and from sharing personal health information with third parties for advertising purposes. The company would also be required to implement several measures to address the identified violations, including obtaining users’ affirmative consent before disclosing information to third parties (the company would be prohibited from using “dark patterns,” or manipulative designs, to obtain consent), directing third parties to delete shared data, notifying users about the breaches and the FTC’s enforcement action, implementing a data retention schedule, and putting in place a comprehensive privacy program to safeguard consumer data.
On January 19, the FTC announced an action against an Ohio-based eye surgery provider (respondent) concerning allegations that it engaged in “bait-and-switch” advertising. According to the FTC’s complaint, the respondent engaged in deceptive business practices by marketing eye surgery for $250, when only 6.5 percent of patients who received consultations qualified for that price. According to the FTC, despite the advertised price, the company typically quoted consumers with less than near-normal vision a price between $1,800 and $2,295 per eye. The FTC also alleged that the respondent failed to disclose up front that the promotional price was per eye.
Under the terms of the decision and order (which was granted final approval on March 15), the respondent must, among other things, pay $1.25 million in redress to harmed customers. Additionally, the respondent is banned from using deceptive business practices and is required to make certain clear and conspicuous disclosures when advertising the surgery at a price or discount for which most consumers would not qualify. Specifically, such disclosures must include whether the price is per eye, the price most consumers pay per eye, and any requirements or qualifications needed to obtain the offered price or discount.
The Commission voted 3-1 to issue the administrative complaint and accept the consent agreement. Commissioner Christine S. Wilson issued a dissenting statement, arguing that there are “no clear rules” regarding the qualifications for the eye surgery referenced in the complaint. She stated that she is “concerned that requiring the inclusion of specific medical parameters in advertisements, when those parameters could be either over- or under-inclusive depending upon the results of the consultation, could be more confusing than helpful.”
On January 9, the DOJ informed a New York federal judge that it had reached a follow-up agreement with a global social media company to ensure its compliance with a June 2022 settlement that required the company to stop using a tool that allowed advertisers to exclude certain users from seeing housing ads based on their sex and estimated race/ethnicity. Explaining that the tool violated the Fair Housing Act, the letter said the company agreed to allow the tool to expire and agreed to build a system to reduce variances in its housing ad delivery system related to sex and estimated race/ethnicity. A follow-up agreement reached between the parties on compliance targets established that the company will be subject to court oversight and regular compliance review through June 27, 2026. The company released a statement following the settlement announcing it is making changes “in part to address feedback we’ve heard from civil rights groups, policymakers and regulators about how our ad system delivers certain categories of personalized ads, especially when it comes to fairness.” The company further noted that “while HUD raised concerns about personalized housing ads specifically, we also plan to use this method for ads related to employment and credit. Discrimination in housing, employment and credit is a deep-rooted problem with a long history in the US, and we are committed to broadening opportunities for marginalized communities in these spaces and others.”
On September 15, the FTC released a report, Bringing Dark Patterns to Light, examining how “dark patterns” can affect consumer choice and decision-making and may violate the law. The report stems from an April 2021 workshop the Commission held to explore dark patterns. The dark pattern tactics detailed in the report include disguising ads to look like independent content, making it difficult for consumers to cancel subscriptions or charges, burying key terms or junk fees, and tricking consumers into sharing their data. The report highlighted the FTC’s efforts to combat the use of dark patterns in the marketplace and reiterated the Commission’s commitment to taking action against tactics designed to trick and trap consumers. Among other things, the report identified four common dark pattern tactics: design elements that (i) induce false beliefs; (ii) hide or delay disclosure of material information; (iii) lead to unauthorized charges; and (iv) obscure or subvert privacy choices. The report also cited a 2017 case as an example of past enforcement work, in which the FTC fined a company $2.2 million for enabling default settings that allowed its smart TVs to collect and share consumers’ viewing activity with third parties, while providing only a brief notice to some consumers that the agency said could easily be missed.
On September 8, the FTC hosted a forum regarding its Advance Notice of Proposed Rulemaking (ANPR) on commercial surveillance and data security practices. As previously covered by InfoBytes, the ANPR was issued in August to solicit public comment on “the harms stemming from commercial surveillance and whether new rules are needed to protect people’s privacy and information.” The ANPR noted that there is increasing evidence that some surveillance-based services may be addictive to children and lead to a wide variety of mental health and social harms. The forum featured remarks by FTC Chair Lina M. Khan and Commissioners Rebecca Kelly Slaughter and Alvaro Bedoya, as well as a staff presentation, two panel discussions, and comments from the public. Chair Khan noted in her remarks that the discussion and comments at the forum will be critical in determining the evidentiary basis for proceeding with a rulemaking and whether the legal requirements for crafting any particular type of rule are met. However, some observers expressed concern that the FTC’s ANPR could undermine efforts to pass federal privacy legislation. Slaughter noted in her remarks that she “support[s] strong federal privacy legislation, but until there’s a law on the books, the commission has a duty to use all the tools we have to investigate and address unlawful behavior in the market.” Commissioners Slaughter and Bedoya also stressed the need for public engagement to understand commercial surveillance.
The first panel focused on industry perspectives on commercial surveillance and data security. When asked about best practices or potential business models developed by businesses to mitigate consumer harm and protect data, a panelist noted that many approaches are underway, but that the guiding principle is that documentation supports transparency by prompting critical thinking at each step of the machine learning lifecycle. One panelist, addressing concerns about businesses tracking personal data, stated that because retailers collect information about their customers when they make purchases online and may recommend related offerings, regulators “should not interfere with these direct relationships.” Another panelist warned against treating all data collection and processing equally, stressing that the FTC should focus its enforcement tools on third parties.
The second panel featured consumer advocates discussing interests, concerns, risks, and harms related to commercial surveillance, as well as ways to mitigate consumer harms and protect data. The advocates noted, among other things, that the FTC should impose heightened safeguards on sensitive data, such as precise location records and information associated with children. Additionally, the panelists advocated for establishing a rule under the FTC’s Section 5 unfairness authority that limits widescale tracking. Specifically, one panelist discussed how the FTC should approach a data minimization rule under Section 5, recommending that such a rule ban secondary use and third-party disclosures. With regard to combating discrimination through data collection and advertising, a panelist noted that shifting data protection responsibilities from individuals onto companies could play an important part in ensuring that data-driven algorithms that deliver ads or content do not discriminate against consumers.