
InfoBytes Blog

Financial Services Law Insights and Observations


  • Illinois Supreme Court says BIPA claims accrue with every transmission

    Privacy, Cyber Risk & Data Security

    On February 17, the Illinois Supreme Court issued a split decision holding that under the state’s Biometric Information Privacy Act (BIPA), claims accrue “with every scan or transmission of biometric identifiers or biometric information without prior informed consent.” The plaintiff filed a proposed class action alleging that a defendant fast food chain violated BIPA sections 15(b) and (d) by unlawfully collecting her biometric data and disclosing it to a third-party vendor without first obtaining her consent. According to the plaintiff, shortly after she began her employment in 2004, the defendant introduced a biometric-collection system that required employees to scan their fingerprints in order to access pay stubs and computers. Under BIPA (which became effective in 2008), section 15(b) prohibits private entities from collecting, capturing, purchasing, receiving through trade, or otherwise obtaining “a person’s biometric data without first providing notice to and receiving consent from the person,” while section 15(d) provides that private entities “may not ‘disclose, redisclose, or otherwise disseminate’ biometric data without consent.”

    While the plaintiff asserted that the defendant did not seek her consent until 2018, the defendant argued, among other things, that the action was untimely because the plaintiff’s claim accrued the first time the defendant obtained her biometric data, which in this case was in 2008, after BIPA’s effective date. The plaintiff countered that “a new claim accrued each time she scanned her fingerprints” and her data was sent to a third-party authenticator, thus “rendering her action timely with respect to the unlawful scans and transmissions that occurred within the applicable limitations period.” The U.S. District Court for the Northern District of Illinois agreed with the plaintiff but certified its order for immediate interlocutory appeal after “finding that its decision involved a controlling question of law on which there is substantial ground for disagreement.”

    The U.S. Court of Appeals for the Seventh Circuit ultimately found that the parties’ competing interpretations of claim accrual were reasonable under Illinois law, and agreed that “the novelty and uncertainty of the claim-accrual question” warranted certification to the Illinois Supreme Court. The question certified to the high court asked whether “section 15(b) and (d) claims accrue each time a private entity scans a person’s biometric identifier and each time a private entity transmits such a scan to a third party, respectively, or only upon the first scan and first transmission[.]”

    The majority held that the plain language of the statute supports the plaintiff’s interpretation. “With the subsequent scans, the fingerprint is compared to the stored copy of the fingerprint. Defendant fails to explain how such a system could work without collecting or capturing the fingerprint every time the employee needs to access his or her computer or pay stub,” the high court said. The majority rejected the defendant’s argument that a BIPA claim is limited to the initial scan or transmission of biometric information since that is when the individual loses the right to control their biometric information “[b]ecause a person cannot keep information secret from another entity that already has it.” This interpretation, the majority wrote, wrongly assumes that BIPA limits claims under section 15 to the first time a party’s biometric identifier or biometric information is scanned or transmitted. The Illinois Supreme Court further held that “[a]s the district court observed, this court has repeatedly held that, where statutory language is clear, it must be given effect, ‘even though the consequences may be harsh, unjust, absurd or unwise.’” However, the majority emphasized that BIPA does not contain language “suggesting legislative intent to authorize a damages award that would result in the financial destruction of a business,” adding that because “we continue to believe that policy-based concerns about potentially excessive damage awards under [BIPA] are best addressed by the legislature, . . . [w]e respectfully suggest that the legislature review these policy concerns and make clear its intent regarding the assessment of damages under [BIPA].”

    The dissenting judges countered that “[i]mposing punitive, crippling liability on businesses could not have been a goal of [BIPA], nor did the legislature intend to impose damages wildly exceeding any remotely reasonable estimate of harm.” “Indeed, the statute’s provision of liquidated damages of between $1000 and $5000 is itself evidence that the legislature did not intend to impose ruinous liability on businesses,” the dissenting judges wrote, cautioning that plaintiffs may be incentivized to delay bringing claims for as long as possible in an effort to increase actionable violations. Under BIPA, individuals have five years to assert violations of section 15—the statute of limitations recently established by a ruling issued by the Illinois Supreme Court earlier this month (covered by InfoBytes here).

    Privacy, Cyber Risk & Data Security Courts State Issues Illinois BIPA Enforcement Consumer Protection Class Action Appellate

  • EU says EU-US Data Privacy Framework lacks adequate protections

    Privacy, Cyber Risk & Data Security

    On February 14, the European Parliament’s Committee on Civil Liberties, Justice and Home Affairs released a draft motion for a resolution concerning the adequacy of protections afforded under the EU-US Data Privacy Framework. As previously covered by InfoBytes, last October President Biden signed an Executive Order on Enhancing Safeguards for United States Signals Intelligence Activities (E.O.) to address the facilitation of transatlantic data flows between the EU and the U.S. The E.O. also outlined bolstered commitments that the U.S. will take under the EU-U.S. Data Privacy Framework (a replacement for the EU-U.S. Privacy Shield). In 2020, the Court of Justice of the EU (CJEU) annulled the EU-U.S. Privacy Shield after determining that, because the requirements of U.S. national security, public interest, and law enforcement have “primacy” over the data protection principles of the EU-U.S. Privacy Shield, data transferred under the EU-U.S. Privacy Shield would not be subject to the same level of protections prescribed by the EU’s General Data Protection Regulation (GDPR).

    In the draft resolution, the Committee urged the European Commission not to adopt any new adequacy decisions needed for the EU-U.S. Data Privacy Framework to officially take effect. According to the Committee, the framework “fails to create actual equivalence in the level of protection” provided to EU residents’ transferred data. Among other things, the Committee found that the government surveillance safeguards outlined in the E.O. “are not in line” with “long-standing key elements of the EU data protection regime as related to principles of proportionality and necessity.” The Committee also expressed concerns that “these principles will be interpreted solely in light of [U.S.] law and legal traditions” and appear to take a “broad interpretation” of proportionality. The Committee also flagged concerns that the framework does not establish an obligation to notify EU residents that their personal data has been processed, “thereby undermining their right to access or rectify their data.” Additionally, “the proposed redress process does not provide for an avenue for appeal in a federal court,” thereby removing the possibility for EU residents to claim damages. Moreover, “remedies available for commercial matters” are “largely left to the discretion of companies, which can select alternative remedy avenues such as dispute resolution mechanisms or the use of companies’ privacy [programs],” the Committee said.

    The Committee called on the Commission “to continue negotiations with its [U.S.] counterparts with the aim of creating a mechanism that would ensure such equivalence and which would provide the adequate level of protection required by Union data protection law and the Charter as interpreted by the CJEU,” and urged the Commission “not to adopt the adequacy finding.”

    Privacy, Cyber Risk & Data Security Of Interest to Non-US Persons EU Consumer Protection EU-US Data Privacy Framework Biden GDPR

  • Colorado releases privacy act updates

    Privacy, Cyber Risk & Data Security

    Last month, the Colorado attorney general released a third version of draft rules to implement and enforce the Colorado Privacy Act (CPA). A hearing on the proposed draft rules was held February 1. As previously covered by a Special Alert, the CPA was enacted in July 2021 to establish a framework for personal data privacy rights. The CPA, which takes effect July 1, 2023, with certain opt-out provisions taking effect July 1, 2024, provides consumers with numerous rights, including the right to access their personal data, opt out of certain uses of personal data, make corrections to personal data, request deletion of personal data, and obtain a copy of personal data in a portable format. Under the CPA, the attorney general has enforcement authority for the law, which does not have a private right of action. The attorney general also has authority to promulgate rules to carry out the requirements of the CPA and issue interpretive guidance and opinion letters, as well as the authority to develop technical specifications for at least one universal opt-out mechanism. The attorney general previously released two versions of the draft rules last year (covered by InfoBytes here and here).

    The third set of draft rules seeks to address additional concerns raised through public comments and makes a number of changes, including:

    • Clarifying definitions. The modifications add, delete, and amend several definitions, including those related to “bona fide loyalty program,” “information that a [c]ontroller has a reasonable basis to believe the [c]onsumer has lawfully made available to the general public,” “publicly available information,” “revealing,” and “sensitive data inference” or “sensitive data inferences.” Among other things, the definition of “publicly available information” has been narrowed by removing the exception to the definition that had excluded publicly available information that has been combined with non-publicly available information. Additionally, sensitive data inferences now refer to inferences which “are used to” indicate certain sensitive characteristics.
    • Right to opt out and right to access. The modifications outline controller requirements for complying with opt-out requests, including when opt-out requests must be completed, as well as provisions for how privacy notice opt-out disclosures must be sent to consumers, and how consumers are to be provided mechanisms for opting out of the processing of personal data for profiling that results in the provision or denial of financial or lending services or other opportunities. With respect to the right to access, controllers must implement and maintain reasonable data security measures when processing any documentation related to a consumer’s access request.
    • Right to correct and right to delete. Among other changes, the modifications add language providing consumers with the right to correct inaccuracies and clarify that a controller “may decide not to act upon a [c]onsumer’s correction request if the [c]ontroller determines that the contested [p]ersonal [d]ata is more likely than not accurate” and has exhausted certain specific requirements. The modifications add requirements for when a controller determines that certain personal data is exempted from an opt-out request.
    • Notice and choice of universal opt-out mechanisms. The modifications specify that disclosures provided to consumers do not need to be tailored to Colorado or refer to Colorado “or to any other specific provisions of these rules or the Colorado Privacy Act examples.” Additionally, a platform, developer, or provider that provides a universal opt-out mechanism may, but is not required to, authenticate that a user is a resident of the state.
    • Controller obligations. Among other things, a controller may choose to honor an opt-out request received through a universal opt-out mechanism before July 1, 2024, may respond by choosing to opt a consumer out of all relevant opt-out rights should the universal opt-out mechanism be unclear, and may choose to authenticate that a user is a resident of Colorado but is not required to do so.
    • Purpose specification. The modifications state that controllers “should not specify so many purposes for which [p]ersonal [d]ata could potentially be processed to cover potential future processing activities that the purpose becomes unclear or uninformative.” Controllers must modify disclosures and necessary documentation if the processing purpose has “evolved beyond the original express purpose such that it becomes a distinct purpose that is no longer reasonably necessary to or compatible with the original express purpose.”
    • Consent. The modifications clarify that consent is not freely given when it “reflects acceptance of a general or broad terms of use or similar document that contains descriptions of [p]ersonal [d]ata [p]rocessing along with other, unrelated information.” Requirements are also provided for how a controller may proactively request consent to process personal data after a consumer has opted out.
    • User interface design, choice architecture, and dark patterns. The modifications provide that a consumer’s “ability to exercise a more privacy-protective option shall not be unduly longer, more difficult, or time-consuming than the path to exercise a less privacy-protective option.” The modifications also specify principles that should be considered when designing a user interface or a choice architecture used to obtain consent, so that it “does not impose unequal weight or focus on one available choice over another such that a [c]onsumer’s ability to consent is impaired or subverted.”

    Additional modifications have been made to personal data use limitations, technical specifications, public lists of universal opt-out mechanisms, privacy notice content, loyalty programs, duty of care, and data protection assessments. Except for provisions with specific delayed effective dates, the rules take effect July 1 if finalized.

    On February 28, the attorney general announced that the revised rules were adopted on February 23, but are subject to a review by the attorney general and may require additional edits before they can be finalized and published in the Colorado Register. 

    Privacy, Cyber Risk & Data Security State Issues State Attorney General Colorado Colorado Privacy Act Consumer Protection

  • California’s privacy agency finalizes CPRA regulations

    Privacy, Cyber Risk & Data Security

    On February 3, the California Privacy Protection Agency (CPPA) Board voted unanimously to adopt and approve updated regulations for implementing the California Privacy Rights Act (CPRA). The proposed final regulations will now go to the Office of Administrative Law, which will have 30 working days to review and approve or disapprove the regulations. As previously covered by InfoBytes, the CPRA (largely effective January 1, 2023, with enforcement delayed until July 1, 2023) was approved by ballot measure in November 2020 to amend and build on the California Consumer Privacy Act (CCPA). In July 2022, the CPPA initiated formal rulemaking procedures to adopt proposed regulations implementing the CPRA, and in November the agency posted updated draft regulations (covered by InfoBytes here and here).

    According to the CPPA’s final statement of reasons, the proposed final regulations (which are substantially similar to the version of the proposed regulations circulated in November) address comments received by stakeholders, and include the following modifications from the initial proposed text:

    • Amending certain definitions. The proposed changes would, among other things, modify the definition of “disproportionate effort” to apply to service providers, contractors, and third parties in addition to businesses, as such term is used throughout the regulations, to limit the obligation of businesses (and other entities) with respect to certain consumer requests. The term is further defined as “when the time and/or resources expended to respond to the request significantly outweighs the reasonably foreseeable impact to the consumer by not responding to the request,” and has been modified “to operationalize the exception to complying with certain CCPA requests when it requires ‘disproportionate effort.’” The proposed changes also introduce the definition of “unstructured” personal information, which describes personal information that could not be retrieved or organized in a predefined manner without disproportionate effort on behalf of the business, service provider, contractor, or third party as it relates to the retrieval of text, video, and audio files.
    • Outlining restrictions on how a consumer’s personal information is collected or used. The proposed changes outline factors for determining whether the collection or processing of personal information is consistent with a consumer’s “reasonable expectations.” The modifications also add language explaining how a business should “determine whether another disclosed purpose is compatible with the context in which the personal information was collected,” and present factors such as the reasonable expectation of the consumer at the time of collection, the nature of the other disclosed purpose, and the strength of the link between such expectation and the nature of the other disclosed purpose, for assessing compatibility. Additionally, a section has been added to reiterate requirements “that a business’s collection, use, retention, and/or sharing of a consumer’s personal information must be ‘reasonably necessary and proportionate’ for each identified purpose.” The CPPA explained that this guidance is necessary for ensuring that businesses do not create unnecessary and disproportionate negative impacts on consumers.
    • Providing disclosure and communications requirements. The proposed changes also introduce formatting and presentation requirements, clarifying that disclosures must be easy to read and understandable and conform to applicable industry standards for persons with disabilities, that conspicuous links for websites should appear in a similar manner as other similarly posted links, and, for mobile applications, that conspicuous links should be accessible in the business’ privacy policy.
    • Clarifying requirements for consumer requests and obtaining consumer consent. Among other things, the proposed changes introduce technical requirements for the design and implementation of processes for obtaining consumer consent and fulfilling consumer requests, including but not limited to “symmetry-in-choice,” which prohibits businesses from creating more difficult or time-consuming paths for more privacy-protective options than the paths to exercise less privacy-protective options. The modifications also provide that businesses should avoid choice architecture that impairs or interferes with a consumer’s ability to make a choice, as “consent” under the CCPA requires that it be freely given, specific, informed, and unambiguous. Moreover, the statutory definition of a “dark pattern” does not require that a business “intend to design a user interface to have the substantial effect of subverting or impairing consumer choice.” Additionally, businesses that are aware of, but do not correct, broken links and nonfunctional email addresses may be in violation of the regulation.
    • Amending business practices for handling consumer requests. The revisions clarify that a service provider and contractor may use self-service methods that enable the business to delete personal information that the service provider or contractor has collected pursuant to a written contract with the business (additional clarification is also provided on how a service provider or contractor’s obligations apply to the personal information collected pursuant to its written contract with the business). Businesses can also provide a link to resources that explain how specific pieces of personal information can be deleted.
    • Amending requests to correct/know. Among other things, the revisions add language to allow “businesses, service providers, and contractors to delay compliance with requests to correct, with respect to information stored on archived or backup systems until the archived or backup system relating to that data is restored to an active system or is next accessed or used.” Consumers will also be required to make a good-faith effort to provide businesses with all necessary information available at the time of a request. A section has also been added, which clarifies “that implementing measures to ensure that personal information that is the subject of a request to correct remains corrected factors into whether a business, service provider, or contractor has complied with a consumer’s request to correct in accordance with the CCPA and these regulations.” Modifications have also been made to specify that a consumer can request that a business disclose their personal information for a specific time period, and changes have been made to provide further clarity on how a service provider or contractor’s obligations apply to personal information collected pursuant to a written contract with a business.
    • Amending opt-out preference signals. The proposed changes clarify that the requirement to process opt-out preference signals applies only to businesses that sell or share personal information. Language has also been added to explain that “the opt-out preference signal shall be treated as a valid request to opt-out of sale/sharing for any consumer profile, including pseudonymous profiles, that are associated with the browser or device for which the opt-out preference signal is given.” When consumers do not respond to a business’s request for more information, a “business must still process the request to opt-out of sale/sharing” to ensure that “a business’s request for more information is not a dark pattern that subverts consumer’s choice.” Additionally, businesses should not interpret the absence of an opt-out preference signal as a consumer’s consent to opt-in to the sale or sharing of personal information.
    • Amending requests to opt-out of sale/sharing. The revisions, among other things, clarify that, at a minimum, a business shall allow consumers to submit requests to opt-out of sale/sharing through an opt-out preference signal and through one of the following methods: an interactive form accessible via the “Do Not Sell or Share My Personal Information” link, the Alternative Opt-out Link, or the business’s privacy policy. The revisions also make various changes related to service provider, contractor, and third-party obligations.
    • Clarifying requests to limit use and disclosure of sensitive personal information. The regulations require businesses to provide specific disclosures related to the collection, use, and rights of consumers for limiting the use of sensitive personal information in certain cases, including, among other things, requiring the use of a link to “Limit the Use of My Sensitive Personal Information” and honoring any limitations within 15 business days of receipt. The regulations also provide specific enumerated business uses where the right to limit does not apply, including to ensure physical safety and to prevent, detect, and investigate security incidents.
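
    As context for the opt-out preference signal requirement above: the most widely deployed such signal today is the Global Privacy Control (GPC), which participating browsers transmit as the HTTP request header "Sec-GPC: 1". The following is a minimal, illustrative sketch only (not legal guidance; the function name and plain-dictionary header handling are assumptions, not anything prescribed by the regulations) of how a server might detect the signal:

```python
def should_opt_out_of_sale_sharing(headers: dict) -> bool:
    """Detect a Global Privacy Control signal in HTTP request headers.

    Under the GPC specification, a browser conveying the user's opt-out
    preference sends the request header `Sec-GPC: 1`. A business that
    sells or shares personal information would treat this as a request
    to opt out of sale/sharing for the associated browser or device.
    """
    # HTTP header field names are case-insensitive, so normalize them.
    normalized = {name.lower(): value.strip() for name, value in headers.items()}
    # The GPC spec defines "1" as the value indicating the preference is set.
    return normalized.get("sec-gpc") == "1"
```

    Consistent with the rule described above, the absence of the signal conveys nothing: it is not consent to opt in to the sale or sharing of personal information, and this sketch covers only detection, not the downstream suppression of sale/sharing.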

    The proposed final regulations also clarify when businesses must provide a notice of right to limit, modify how the alternative opt-out link should be presented, explain how businesses should address scenarios in which opt-out preference signals conflict with financial incentive programs, revise service provider, contractor, and third-party obligations related to the collection of personal information, as well as contract requirements, clarify the special rules applicable to consumers under 16 years of age, and modify provisions related to investigations and enforcement.

    Separately, on February 10, the CPPA posted a preliminary request for comments on cybersecurity audits, risk assessments, and automated decisionmaking to inform future rulemaking. Among other things, the CPPA is interested in learning about steps it can take to ensure cybersecurity audits are “thorough and independent,” what content should be included in a risk assessment (including whether the CPPA should adopt the approaches in the EU GDPR and/or Colorado Privacy Act), and how “automated decisionmaking technology” is defined in other laws and frameworks. The CPPA noted that this invitation for comments is not a proposed rulemaking action, but rather serves as an opportunity for information gathering. Comments are due March 27.

    Privacy, Cyber Risk & Data Security State Issues California CCPA CPPA CPRA Compliance State Regulators Opt-Out Consumer Protection

  • FTC bans health vendor from sharing consumer info with advertiser

    Federal Issues

    On February 1, the DOJ filed a complaint on behalf of the FTC against a telehealth and prescription drug discount provider for allegedly violating the FTC Act and the Health Breach Notification Rule by failing to notify consumers that it was disclosing their personal health information to third parties for advertising purposes. As a vendor of personal health records, the FTC stated that the company is required to comply with the Health Breach Notification Rule, which imposes certain reporting obligations on health apps and other companies that collect or use consumers’ health information (previously covered by InfoBytes here).

    According to the complaint filed in the U.S. District Court for the Northern District of California, the company—which allows users to keep track of their personal health information, including saving, tracking, and receiving prescription alerts—shared sensitive personal health information with advertisers and other third parties for years, even though it allegedly promised users that their health information would never be shared. The FTC maintained that the company also monetized users’ personal health information and used certain shared data to target its own users with personalized health- and medication-specific advertisements on various social media platforms. The company also allegedly: (i) permitted third parties to use shared data for their own internal purposes; (ii) falsely claimed compliance with the Digital Advertising Alliance principles (which require companies to obtain consent prior to using health information for advertising purposes); (iii) misrepresented its HIPAA compliance; (iv) failed to maintain sufficient formal, written, or standard privacy or data sharing policies or procedures to protect personal health information; and (v) failed to report the unauthorized disclosures.

    Under the terms of the proposed court order filed by the DOJ, the company would be required to pay a $1.5 million civil penalty, and would be prohibited from engaging in the identified alleged deceptive practices and from sharing personal health information with third parties for advertising purposes. The company would also be required to implement several measures to address the identified violations, including obtaining users’ affirmative consent before disclosing information to third parties (the company would be prohibited from using “dark patterns,” or manipulative designs, to obtain consent), directing third parties to delete shared data, notifying users about the breaches and the FTC’s enforcement action, implementing a data retention schedule, and putting in place a comprehensive privacy program to safeguard consumer data.

    Federal Issues FTC Enforcement Privacy, Cyber Risk & Data Security Advertisement Consumer Protection FTC Act Health Breach Notification Rule Dark Patterns

  • California investigating mobile apps’ CCPA compliance

    Privacy, Cyber Risk & Data Security

    On January 27, the California attorney general announced an investigation into mobile applications’ compliance with the California Consumer Privacy Act (CCPA). The AG sent letters to businesses in the retail, travel, and food service industries that maintain popular mobile apps that allegedly fail to comply with consumer opt-out requests or do not offer mechanisms for consumers to delete personal information or stop the sale of their data. The investigation also focuses on businesses that fail to process consumer opt-out and data-deletion requests submitted through an authorized agent, as required under the CCPA. “On this Data Privacy Day and every day, businesses must honor Californians’ right to opt out and delete personal information, including when those requests are made through an authorized agent,” the AG said, adding that authorized agent requests include “those sent by Permission Slip, a mobile application developed by Consumer Reports that allows consumers to send requests to opt out and delete their personal information.” The AG encouraged the tech industry to develop and adopt user-enabled global privacy controls for mobile operating systems to enable consumers to stop apps from selling their data.

    As previously covered by InfoBytes, the CCPA was enacted in 2018 and took effect January 1, 2020. The California Privacy Protection Agency is currently working on draft regulations to implement the California Privacy Rights Act, which largely became effective January 1, to amend and build upon the CCPA. (Covered by InfoBytes here.)

    Privacy, Cyber Risk & Data Security State Issues State Attorney General California CCPA Compliance Opt-Out Consumer Protection CPRA

  • FCC warns telecoms to stop carrying “mortgage scam” robocalls

    Federal Issues

    On January 24, the FCC’s Enforcement Bureau announced it had ordered telecommunications companies to effectively mitigate robocall traffic originating from a Florida-based real estate brokerage firm selling mortgage scams. The FCC also sent a cease-and-desist letter to a voice service provider carrying the allegedly illegal robocall traffic. According to the FCC, several state attorneys general filed lawsuits late last year against the firm for allegedly using “misleading robocalls to ‘swindle’ and ‘scam’ residents into mortgaging their homes in exchange for small cash payments.” (See state AG press releases here, here, and here.) Additionally, last month, Senate Banking Committee Chairman Sherrod Brown (D-OH), along with Senators Tina Smith (D-MN) and Ron Wyden (D-OR), sent a letter to the FTC and the CFPB requesting a review of the firm’s use of exclusive 40-year listing agreements marketed as a “loan alternative.” (Covered by InfoBytes here.) In shutting down the robocalls, FCC Chairwoman Jessica Rosenworcel stressed that sending junk calls to financially stressed homeowners in order to offer “deceptive products and services is unconscionable.” Enforcement Bureau Chief Loyaan A. Egal added that the voice service provider should have been applying “Know Your Customer” principles before allowing the traffic on its networks.

    Federal Issues FCC Robocalls Consumer Finance Mortgages Consumer Protection Enforcement State Issues State Attorney General Listing Agreement

  • U.S. messaging service fined €5.5 million for GDPR violations

    Privacy, Cyber Risk & Data Security

    On January 19, the Irish Data Protection Commission (DPC) announced the conclusion of an inquiry into the data processing practices of a U.S.-based messaging service’s Ireland operations and fined the messaging service €5.5 million. The investigation was part of a broader GDPR compliance inquiry prompted by a May 25, 2018 complaint from a German data subject.

    The DPC noted that in advance of the date on which the GDPR became effective (May 25, 2018), the U.S. company updated its terms of service and notified users that, to continue accessing the messaging service, they would need to accept the updated terms by clicking “agree and continue.” The complainant asserted that, in doing so, the messaging service forced users to consent to the processing of their personal data for service improvement and security. 

    The company claimed that when a user accepted the updated terms of service, the user entered into a contract with the company. The company therefore maintained that “the processing of users’ data in connection with the delivery of its service was necessary for the performance of that contract, to include the provision of service improvement and security features, so that such processing operations were lawful by reference to Article 6(1)(b) of the GDPR (the ‘contract’ legal basis for processing).” The complainant argued that, contrary to the company’s stated intention, the company was “seeking to rely on consent to provide a lawful basis for its processing of users’ data.”

    The DPC issued a draft decision that was submitted to its EU peer regulators (Concerned Supervisory Authorities or “CSAs”). The DPC concluded that the company was in breach of its GDPR transparency obligations under Articles 12 and 13(1)(c), and stated that users had “insufficient clarity as to what processing operations were being carried out on their personal data.” With respect to whether the company was obliged to rely on consent as its legal basis in connection with the delivery of the service (including for service improvement and security purposes), the DPC disagreed with the complainant’s “forced consent” argument, finding that the company was not required to rely on user consent as providing a lawful basis for its processing of their personal data.

    Noting that the DPC had previously imposed a €225 million fine against the company last September for breaching its transparency obligations to users about how their information was being disclosed over the same time period (covered by InfoBytes here), the DPC did not propose an additional fine. Six of the 47 CSAs, however, objected to the DPC’s conclusion as to the “forced consent” aspect of its decision, arguing that the company “should not be permitted to rely on the contract legal basis on the basis that the delivery of service improvement and security could not be said to be necessary to perform the core elements of what was said to be a much more limited form of contract.”

    The dispute was referred to the European Data Protection Board (EDPB), which issued a final decision on January 12, where it found that, “as a matter of principle, [the company] was not entitled to rely on the contract legal basis as providing a lawful basis for its processing of personal data for the purposes of service improvement and security,” and that in doing so, the company contravened Article 6(1) of the GDPR.

    The DPC handed down a €5.5 million administrative fine and ordered the company to bring its processing operations into compliance with the GDPR within a six-month period. Separately, the EDPB instructed the DPC “to conduct a fresh investigation” that would span all of the company’s processing operations to determine whether the company is in compliance with relevant GDPR obligations regarding the processing of personal data for behavioral advertising, marketing purposes, the provisions of metrics to third parties, and the exchange of data with affiliated companies for the purpose of service improvements.

    The DPC challenged the EDPB’s decision, stating that the board “does not have a general supervision role akin to national courts in respect of national independent authorities, and it is not open to the EDPB to instruct and direct an authority to engage in open-ended and speculative investigation.” The DPC suggested that it is considering bringing an action before the Court of Justice of the European Union to “seek the setting aside of the EDPB’s direction.”

    Privacy, Cyber Risk & Data Security Of Interest to Non-US Persons Ireland Enforcement Consumer Protection EU GDPR

  • CFPB proposes T&C registry for nonbanks

    Agency Rule-Making & Guidance

    On January 11, the CFPB announced a proposed rule to create a public registry of terms and conditions used in non-negotiable, “take it or leave it” nonbank form contracts that “claim to waive or limit consumer rights and protections.” Under the proposal, supervised nonbank companies would be required to report annually to the Bureau on their use of standard-form contract terms that “seek to waive consumer rights or other legal protections or limit the ability of consumers to enforce or exercise their rights.” The terms and conditions—which would be made publicly available—would include those that address waivers of consumer claims, liability limits, legal action limits, class action bans, arbitration agreements, liquidated damages clauses, as well as other waivers of consumer rights.

    The Bureau explained that its proposal is intended to “facilitate public awareness and oversight” about what nonbanks are putting in form contracts. “Some companies slip terms and conditions into their form contracts that try to take away consumer protections, try to limit how consumers exercise their rights, or try to quiet consumer complaints or criticism,” the Bureau stated in its announcement. “[M]ore broadly, the terms and conditions potentially undermine consumer financial protection law.”

    The Bureau provided several examples of such terms and conditions, including: (i) unlawful mandatory arbitration agreements that are included in servicemember loan contracts; (ii) credit monitoring service agreements that “undermine credit reporting rights” by prohibiting consumers from pursuing legal action, including class action lawsuits, for FCRA violations; (iii) occurrences where lenders use clauses that waive liability for bank fees that borrowers incur due to repeated payment collection attempts; (iv) mortgage contracts that make “deceptive” use of waivers and limitations that are inconsistent with TILA restrictions; and (v) terms and conditions that try to quiet consumer complaints or criticism.

    All supervised nonbanks, including those operating in payday lending, private student loan origination, mortgage lending and servicing, student loan servicing, automobile financing, consumer reporting, consumer debt collection, and international remittances would be subject to the rule. However, the Bureau is proposing certain exemptions for nonbanks with lower levels of receipts. Comments on the proposal are due 30 days after publication in the Federal Register.

    “[T]he registry would help regulators and law enforcement more easily detect when companies are offering products and services using prohibited, void, and restricted contract terms described above. This would be especially useful to state and tribal regulators with limited resources to alert or take action against companies violating the law,” CFPB Director Rohit Chopra said in an accompanying statement, adding that the Bureau plans to “use data from the registry to identify supervised nonbanks and the risks their terms and conditions pose, prioritize which firms to examine, and plan the scope of those exams.”

    House Financial Services Committee Chairman Patrick McHenry (R-NC) slammed the proposal, saying the “proposed registry of terms and conditions will facilitate the naming and shaming of firms to empower progressive activists. Requiring nonbank financial firms to register publicly with the Bureau is unprecedented—no other industry is required to make public such detailed contract information. The days of Congress giving Director Chopra a free pass for his reckless actions have come to an end.”

    The proposed registry follows a proposal announced in December by the Bureau that would create a database of enforcement actions taken against certain nonbank covered entities, which would include all final public written orders and judgments (including any consent and stipulated orders and judgments) obtained or issued by any federal, state, or local government agency for violation of certain consumer protection laws related to unfair, deceptive, or abusive acts or practices. (Covered by InfoBytes here.)

    Agency Rule-Making & Guidance Federal Issues CFPB Nonbank Consumer Finance Consumer Protection Supervision House Financial Services Committee

  • Agencies highlight downpayment assistance, child privacy in regulatory agendas

    Agency Rule-Making & Guidance

    Recently, the Office of Information and Regulatory Affairs released fall 2022 regulatory agendas for the FTC and HUD. With respect to an FTC review of the Children’s Online Privacy Protection Rule (COPPA) that was commenced in 2019 (covered by InfoBytes here), the Commission stated in its regulatory agenda that it is still reviewing comments. COPPA “prohibits unfair or deceptive acts or practices in connection with the collection, use and/or disclosure of personal information from and about children under the age of 13 on the internet,” and, among other things, “requires operators of commercial websites and online services, with certain exceptions, to obtain verifiable parental consent before collecting, using, or disclosing personal information from or about children.”

    HUD stated in its regulatory agenda that it anticipates issuing a notice of proposed rulemaking in March that would address mortgage downpayment assistance programs. The Housing and Economic Recovery Act of 2008 amended the National Housing Act to add a clause that prohibits any portion of a borrower’s required minimum cash investment from being provided by: “(i) the seller or any other person or entity that financially benefits from the transaction, or (ii) any third party or entity that is reimbursed, directly or indirectly, by any of the parties described in clause (i).” According to the agenda, FHA continues to receive questions about prohibitions on persons or entities that may financially benefit from a mortgage transaction, including “whether down payment assistance programs operated by government entities are being operated in a fashion that would render such assistance prohibited.” A future NPRM would clarify the circumstances in which government entities are deriving a prohibited financial benefit.

    Agency Rule-Making & Guidance Federal Issues FTC HUD COPPA Downpayment Assistance Mortgages Privacy, Cyber Risk & Data Security Consumer Protection FHA
