On February 2, the U.S. District Court for the District of New Mexico granted a technology company’s motion for reconsideration in part, but denied dismissal of the New Mexico attorney general’s action alleging the company designed and marketed mobile gaming applications (apps) targeted toward children that contain illegal tracking software in violation of the Children’s Online Privacy Protection Act (COPPA). As previously covered by InfoBytes, the attorney general filed a lawsuit against a group of technology companies, alleging that the companies’ data collection and sharing practices did not comply with COPPA’s specific notice and consent requirements. The apps’ embedded software development kits (SDKs) allegedly allow the apps to communicate directly with advertising companies that analyze, store, use, share, and sell the data to other third parties to build “increasingly-detailed profiles of child users” in order to send highly targeted advertising. In April 2020, the court denied in part a motion to dismiss by one of the companies, concluding that the attorney general plausibly alleged that the company “had actual knowledge of the child-directed nature” of the apps, and that under COPPA, “ad networks may be held liable for the collection of personal information from child app users only if they have ‘actual knowledge’ that the apps in which their (SDKs) are embedded are ‘directed to children.’” The company moved for reconsideration, arguing that the court improperly held that whether “children were the ‘primary target audience’ of the app was not relevant to the ‘actual knowledge’ determination.”
Upon reconsideration, the court agreed with the company that its April 2020 opinion “misapprehended the significance of the mixed-audience exception to the actual knowledge determination,” but concluded that there is no basis to dismiss the COPPA claim because the attorney general still “adequately alleged actual knowledge on the part of [the company].”
On June 4, the FTC announced that a children’s mobile application developer agreed to pay $150,000 and to delete the personal information it allegedly unlawfully collected from children under the age of 13 to resolve allegations that the developer violated the Children’s Online Privacy Protection Act Rule (COPPA Rule). According to the complaint filed in the U.S. District Court for the Northern District of California, the developer, without notifying parents or obtaining verifiable parental consent, allowed third-party advertising networks to use persistent identifiers to track users of the child-directed apps in order to send targeted advertisements to the children. The proposed settlement requires the developer to destroy any personal data collected from children under 13 and notify and obtain verifiable consent from parents for any child-directed app or website they offer that collects personal information from children under 13. A $4 million penalty is suspended upon the payment of $150,000 due to the developer’s inability to pay.
In dissent, Commissioner Phillips argued that the fine imposed on the developer was too high, noting that having children view advertisements based on the collection of persistent identifiers “is something; but it is not everything,” under COPPA. Commissioner Phillips argued that because the developer did not “share sensitive personal information about children, or publicize it” nor did the developer expose children “to unauthorized contact from strangers, or otherwise put [the children] in danger,” the assessed penalty was too large in comparison to the harm.
In response to the dissent, Chairman Simons argued that while “harm is an important factor to consider…[the FTC’s] first priority is to use penalties to deter practices. Even in the absence of demonstrable money harm, Congress has said that these law violations merit the imposition of civil penalties.”
On February 25, the FTC released its annual report highlighting the agency’s privacy and data security work in 2019. Among other items, the report highlights consumer-related enforcement activities in 2019, including:
- A $5 billion penalty—the largest consumer privacy penalty to date—against a global social media company to resolve allegations that the company violated its 2012 FTC privacy order and mishandled users’ personal information. (Covered by InfoBytes here.)
- A $170 million penalty against a global online search engine and its video-sharing subsidiary to resolve alleged violations of the Children’s Online Privacy Protection Act (COPPA). (Covered by InfoBytes here.)
- A proposed settlement in the FTC’s first case against developers of “stalking” apps that monitor consumers’ mobile devices and allegedly compromise consumer privacy in violation of the FTC Act’s prohibition against unfair and deceptive practices and COPPA.
- A global settlement of up to $700 million issued in conjunction with the CFPB, 48 states, the District of Columbia, and Puerto Rico, to resolve federal and state investigations into a 2017 data breach that reportedly compromised sensitive information for approximately 147 million consumers. (Covered by InfoBytes here.)
The report also discusses the FTC’s enforcement of the EU-U.S. Privacy Shield framework, provides links to FTC congressional testimony on privacy and data security, and offers a list of relevant rulemaking, including rules currently under review. In addition, the report highlights recent privacy-related events, including (i) an FTC hearing examining consumer privacy as part of its Hearings on Competition and Consumer Protection in the 21st Century; (ii) the fourth annual PrivacyCon event, which hosted research presentations on consumer privacy and security issues (covered by InfoBytes here); (iii) a workshop examining possible updates to COPPA; and (iv) a public workshop that examined issues affecting consumer reporting accuracy.
On December 9, a coalition of 25 state attorneys general responded to the FTC’s request for comments on a wide range of issues related to the Children’s Online Privacy Protection Rule (COPPA). As previously covered by InfoBytes, the FTC released a notice in July seeking comments on all major provisions of COPPA, including definitions, notice and parental consent requirements, exceptions to verifiable parental consent, and the safe harbor provision. In response, the AGs strongly recommend that the FTC “significantly” strengthen COPPA, while ensuring that any changes remain flexible and able to evolve with a rapidly changing data landscape. Specifically, the AGs state that COPPA’s definitions of “web site or online service directed to children” and of an “operator” need to be modified, as many first-party platforms embed third parties that allegedly engage in the majority of privacy-invasive online tracking. By expanding the definition of an operator, the AGs claim, COPPA would require compliance by companies that use and profit from the data as well as companies that collect it. According to the AGs, COPPA places a lower burden on third parties, requiring them to be bound by the rule only when they have “actual knowledge” that they are tracking children, even though these entities “are arguably as well-positioned as the operators of the websites and online services to know that they are tracking and monitoring children.”
The AGs also believe that the prong that “recognizes the child-directed nature of the content” should be strengthened, because companies that are able to identify and target consumers through sophisticated algorithms are often disincentivized to use the information to affirmatively identify child-directed websites or other online services. Among other things, the AGs also discuss the need for specifying the appropriate methods used for determining a user’s age, expanding COPPA to protect minors’ biometric data, and providing illustrative security requirements.
On September 4, the FTC and the New York Attorney General announced (see here and here) a combined $170 million proposed settlement with the world’s largest online search engine and its video-sharing site subsidiary concerning alleged violations of the Children’s Online Privacy Protection Act (COPPA). According to the complaint, the video-sharing site allegedly collected personal information in the form of “persistent identifiers” from viewers of child-directed channels without first obtaining verifiable parental consent. The persistent identifiers allegedly generated millions of dollars in revenue by delivering targeted ads to viewers. The FTC and New York AG allege, among other things, that the defendants knew the video-sharing site hosted numerous child-directed channels but told advertisers that the video-sharing site contains general audience content, even informing one advertising company that it did not have users younger than 13 on its platform and therefore channels on its platform did not need to comply with COPPA.
Under COPPA, operators of websites and online services directed at children are prohibited from collecting personal information of children under the age of 13—including through the use of persistent identifiers for targeted advertising purposes—unless the company has explicit parental consent. Furthermore, third parties—such as advertising networks—must also comply with COPPA where they have actual knowledge that personal information is being collected directly from users of child-directed websites and online services.
While neither admitting nor denying the allegations, except as specifically stated within the settlement, the defendants will, among other things, (i) pay a $136 million penalty to the FTC and a $34 million penalty to New York; (ii) change their business practices to comply with COPPA; (iii) maintain a system for channel owners to designate their child-directed content on the video-sharing site; and (iv) disclose their data collection practices and obtain verifiable parental consent prior to collecting personal information from children. According to the FTC, the $136 million penalty is “by far the largest amount the FTC has ever obtained in a COPPA case since Congress enacted the law in 1998.”
On July 17, the FTC released a notice seeking comment on a wide range of issues related to the Children’s Online Privacy Protection Rule (COPPA Rule). The FTC last amended the COPPA Rule in 2013, and while the FTC usually reviews its rules every 10 years, the FTC notes that “[r]apid changes in technology, including the expanded use of education technology, reinforce the need to re-examine the COPPA Rule at this time.” The notice seeks comment on all major provisions of the COPPA Rule, including definitions, notice and parental consent requirements, exceptions to verifiable parental consent, and the safe harbor provision. Additionally, the notice seeks responses to specific questions, including (i) whether the Rule has affected the availability of websites or online services directed to children; (ii) whether the Rule correctly articulates the factors to consider in determining whether a website or online service is directed to children, or whether additional factors should be considered; and (iii) what implications for COPPA enforcement are raised by technologies such as interactive television, interactive gaming, or other similar interactive media. Comments must be received within 90 days after publication in the Federal Register.
On June 27, the FTC held its fourth annual PrivacyCon, which hosted research presentations on a wide range of consumer privacy and security issues. Following opening remarks by FTC Chairman Joseph Simons, the one-day conference featured four plenary sessions covering a number of hot topics:
- Session 1: Privacy Policies, Disclosures, and Permissions. Five presenters discussed various aspects of privacy policies and notices to consumers. The panel discussed current trends showing that privacy notices to consumers have generally become lengthier in recent years, which helps cover the information regulators require, but often results in information overload for consumers more generally. One presenter advocated the concept of a condensed “nutrition label” for privacy, but acknowledged the challenge of distilling complicated activities into short bullets.
- Session 2: Consumer Preferences, Expectations, and Behaviors. This panel addressed research concerning consumer expectations and behaviors with regard to privacy. Among other anecdotal information, the presenters noted that many consumers are aware that personal data is tracked, but consumers are generally unaware of what data collectors ultimately do with the personal data once collected. To that end, one presenter advocated prescriptive limits on data collection in general, which would take the onus off consumers to protect themselves. Separately, with regard to the Children’s Online Privacy Protection Act (COPPA), one presenter noted that the law generally aligns with parents’ privacy expectations, but the implementing regulations and guidelines are too broad and leave too much room for implementation variations.
- Session 3: Tracking and Online Advertising. In the third session, five presenters covered various topics, ranging from the privacy implications of free versus paid-for applications to the impact of the EU’s General Data Protection Regulation (GDPR). According to the presenters, current research suggests that the measurable privacy benefits of paying for an app are “tenuous at best,” and consumers cannot be expected to make informed decisions because the necessary privacy information is not always available in the purchase program on a mobile device such as a phone. As for GDPR, the panel agreed that there are notable reductions in web use, with page views falling 9.7 percent in one study, although it is not clear whether such reductions are directly attributable to the May 25, 2018 effective date for enforcement of GDPR.
- Session 4: Vulnerabilities, Leaks, and Breach Notifications. In the final presentation, presenters discussed new research on how companies can mitigate data security vulnerabilities and improve remediation. One presenter discussed the need for proactive identification of vulnerabilities, noting that the goal should be to patch the real vulnerabilities and limit efforts related to vulnerabilities that are unlikely to be exploited. Another presenter analyzed data breach notifications to consumers, noting that all 50 states have data breach notification laws, but there is no consensus as to best practices related to the content or timing of notifications to consumers. The presenter concluded with recommendations for future notification regulations: (i) incorporate readability testing based on standardized methods; (ii) provide concrete guidelines of when customers need to be notified, what content needs to be included, and how the information should be presented; (iii) include visuals to highlight key information; and (iv) leverage the influence of templates, such as the model privacy form for the Gramm-Leach-Bliley Act.
On April 24, the FTC announced separate settlements with the operators of an online rewards website and a dress-up games website to resolve allegations concerning poorly implemented data security measures and Children’s Online Privacy Protection Act (COPPA) violations. According to the FTC, the online rewards website operator collected personal information (PII) from users who participated in its online offerings and promised users that their account information was secure. However, the operator allegedly failed to implement data security measures or utilize encryption techniques, which granted hackers access to the network. In addition, the operator allegedly maintained PII in clear, unencrypted text. As a result of the breach, hackers published and offered for sale PII for approximately 2.7 million consumers. Under the terms of the decision and order, the operator is, among other things, prohibited from misrepresenting the measures taken to protect consumers’ PII and is required to implement a comprehensive information security program for future collections of PII.
On the same day, the FTC reached a proposed settlement with a dress-up games website and its operators, who allegedly violated COPPA by failing to obtain parental consent before collecting personal information from children under 13 or provide reasonable and appropriate security for the collected data. According to the FTC, data security failures allowed hackers access to the company’s network, which stored information for roughly 245,000 users under age 13. As part of the proposed settlement filed in the U.S. District Court for the Northern District of California, the company and operators, among other things, (i) have agreed to pay $35,000 in civil penalties; (ii) will change their business practices to comply with COPPA; and (iii) are prohibited from selling, sharing, or collecting personal information until a comprehensive data security program is implemented and undergoes independent biennial assessments.
On February 27, the FTC announced a $5.7 million settlement with the operators of a video social networking app concerning alleged violations of the Children’s Online Privacy Protection Act (COPPA). Among other things, the FTC claims the operators failed to provide parents notice of its information collection practices, illegally collected personal information from children under the age of 13 without first obtaining verifiable parental consent, failed to delete personal information when parents requested, and retained information “longer than reasonably necessary to fulfill the purpose for which the information was collected.” Under COPPA, operators of websites and online services directed at children are prohibited from collecting personal information of children under the age of 13, unless the company has explicit parental consent. The FTC alleges that the operators knew a “significant percentage” of its users were under 13 and received thousands of complaints from parents that their children under 13 had created accounts on the app. While neither admitting nor denying the allegations, the operators have agreed to the monetary penalty, will change their business practices to comply with COPPA, and will remove all videos made by children younger than 13. According to the FTC, this settlement is the largest civil penalty obtained to date by the agency for COPPA violations.
New York Attorney General reaches largest ever COPPA settlement to resolve violations of children’s privacy
On December 4, the New York Attorney General announced the largest Children’s Online Privacy Protection Act (COPPA) settlement in U.S. history—totaling approximately $6 million—to resolve allegations that a subsidiary of a telecommunications company conducted billions of auctions for ad space on hundreds of websites it knew were directed to children under the age of 13. According to the Attorney General’s office, the subsidiary collected and disclosed personal data on children through auctions for ad space, allowing advertisers to track and serve targeted ads to children without parental consent. Under COPPA, operators of websites and other online services are prohibited from collecting or sharing the information of children under the age of 13 unless they give notice and have express parental consent. Among other things, the subsidiary also allegedly placed ads on other exchanges capable of auctioning ad space on child-directed websites, and when it won ad space on COPPA-covered websites, the subsidiary treated the space as it would any other and collected user information to serve targeted ads.
Under the terms of the settlement, the subsidiary must (i) create a comprehensive COPPA compliance program, which requires annual COPPA training for staff, regular compliance monitoring, and the retention of service providers that can comply with COPPA, as well as a third party who will assess the privacy controls; (ii) enable website operators that sell ad inventory to indicate what portion of a website is subject to COPPA; and (iii) destroy the personal data it collected on children.
- Jonice Gray Tucker to moderate “Pandemic relief response and lasting impacts on access, credit, banking, and equality” at the American Bar Association Business Law Section Spring Meeting
- Jeffrey P. Naimon to discuss “Post-pandemic CFPB exam preparation” at the Mortgage Bankers Association Spring Conference & Expo
- Jonice Gray Tucker to discuss “Making fair lending work for you” at the Mortgage Bankers Association Spring Conference & Expo
- Jonice Gray Tucker to discuss “Reading the tea leaves of President Biden’s initial financial appointees” at LendIt Fintech
- Moorari K. Shah to discuss “CA, NY, federal licensing and disclosure” at the Equipment Leasing & Finance Association Legal Forum
- Jonice Gray Tucker to discuss “Compliance under Biden” at the WSJ Risk & Compliance Forum
- Sherry-Maria Safchuk to discuss UDAAP at an American Bar Association webinar
- Jonice Gray Tucker to discuss “The future of fair lending” at the Mortgage Bankers Association Legal Issues and Regulatory Compliance Conference