On October 5, the California governor signed AB 694. The bill clarifies that the California Privacy Protection Agency (which was given “full administrative power, authority, and jurisdiction to implement and enforce the [California Consumer Privacy Act]”) would assume responsibility for rulemaking “on or after the later of July 1, 2021, or within six months of the agency providing the Attorney General with notice that it is prepared to assume rulemaking.” As previously covered by InfoBytes, last month the CPPA formally called on stakeholders to provide preliminary comments on proposed California Privacy Rights Act rulemaking. However, the CPPA noted that the invitation for comments is not a proposed rulemaking action and stated that the public will have additional opportunities to provide comments on proposed regulations or modifications when it proceeds with a notice of proposed rulemaking action.
On October 6, the California governor signed SB 41, which requires direct-to-consumer genetic testing companies to provide consumers with information about the collection, use, maintenance, and disclosure of genetic data. Under the Genetic Information Privacy Act (GIPA), companies are required to honor a consumer’s revocation of consent and destroy a consumer’s biological sample within 30 days after the consent has been revoked. Companies must also obtain a consumer’s express consent for collection, use, or disclosure of an individual’s genetic data. GIPA also requires companies to comply with all applicable federal and state laws for disclosing genetic data without a consumer’s express consent, and companies must “implement and maintain reasonable security procedures and practices to protect a consumer’s genetic data against unauthorized access, destruction, use, modification, or disclosure, and develop procedures and practices to enable a consumer to access their genetic data, and to delete their account and genetic data, as specified.” Violations of the law may result in civil penalties ranging from $1,000 to $10,000. Exempt from GIPA’s provisions is medical information governed by the Confidentiality of Medical Information Act, or medical information collected and used by business associates of a covered entity governed by the privacy, security, and data breach notification rules issued by the U.S. Department of Health and Human Services.
Also on October 5, the governor signed AB 825, which expands the definition of “personal information” to include genetic data, regardless of its format. Under existing law, any agency that owns or licenses computerized data that includes personal information is required to immediately disclose a security breach upon discovery to California residents who may have been impacted. Agencies are also required to implement and maintain reasonable security procedures and practices.
Both bills take effect January 1, 2022.
According to sources, Ashkan Soltani, a former chief technologist at the FTC, has been named Executive Director of the California Privacy Protection Agency (CPPA). Among other things, Soltani was an architect of the California Consumer Privacy Act (CCPA). According to CPPA Chair Jennifer Urban, Soltani’s “background in technology and privacy, and his work on both the CCPA and the [California Privacy Rights Act (CPRA)] give him a thorough understanding of California privacy law and will stand him in good stead as he leads Agency staff and helps the Agency fulfill its privacy protection mandate.” As previously covered by InfoBytes, earlier this year, California’s governor announced appointments to the five-member inaugural board for the CPPA, consisting of experts in privacy, technology, and consumer rights. The CPPA is tasked with protecting the privacy rights of consumers over their personal information, and “will have full administrative power, authority, and jurisdiction to implement and enforce” the CCPA and the CPRA, including bringing enforcement actions before an administrative law judge.
On September 22, the California Privacy Protection Agency (CPPA) formally called on stakeholders to provide preliminary comments on proposed rulemaking under the California Privacy Rights Act (CPRA). The CPRA, which established the CPPA to administer, implement, and enforce the act, was approved by ballot measure in November 2020 (covered by InfoBytes here) and updated the existing California Consumer Privacy Act. The invitation for comments highlights several areas of interest for the CPPA as it begins the rulemaking process, including topics related to: (i) cybersecurity audits and risk assessments to be performed by businesses processing personal information that presents a significant risk to consumers’ privacy or security; (ii) matters concerning automated decision-making; (iii) audits performed by the CPPA; (iv) issues related to consumer rights, including consumers’ right to delete, right to correct, and right to know what personal data has been collected or shared, as well as consumers’ rights to opt-out of the selling or sharing of their personal information and to limit the use and disclosure of their sensitive personal information; (v) information to be provided when responding to a consumer’s request to know; and (vi) definitions and categories of information and activities, including what updates or additions should be added to “personal information,” “sensitive personal information,” “precise geolocation,” and “dark patterns,” among other terms. Comments must be submitted by November 8.
The CPRA will become effective January 1, 2023, with enforcement delayed until July 1, 2023. However, the CPRA will apply to personal information collected by a business on or after January 1, 2022. The CPPA notes that this invitation for comments is not a proposed rulemaking action and states that the public will have additional opportunities to provide comments on proposed regulations or modifications when it proceeds with a notice of proposed rulemaking action.
On September 17, the First District Appellate Court of Illinois held that different limitation periods should be applied to the Biometric Information Privacy Act (BIPA), concluding that while Section 15 imposes various duties that all concern privacy, “each duty is separate and distinct.” Specifically, the panel stated that claims related to “[a]ctions for slander, libel or for publication of matter violating the right of privacy” have a one-year limitation period, while “all civil actions not otherwise provided for” carry a five-year limit. Plaintiffs filed a class action complaint alleging violations of BIPA Sections 15(a), 15(b), and 15(d), claiming the defendant collected, stored, used, and disseminated individuals’ biometric data obtained through fingerprint scans without, among other things, (i) informing plaintiffs of the purpose and length of the storage and use of their data; (ii) receiving written release from plaintiffs; (iii) providing a retention schedule and guidelines for destroying the data; or (iv) obtaining consent from plaintiffs and other employees to disseminate their data to third parties. The defendant moved to dismiss, arguing that the claims were filed outside the limitation period, noting that while BIPA itself has no limitation provision, “the one-year limitation period for privacy actions under Code section 13-201 applies to causes of action under [BIPA] because [BIPA’s] purpose is privacy protection.” A state trial court denied the defendant’s motion to dismiss, ruling that the plaintiffs’ claims were subject to Illinois’ “catchall” five-year limitation provision rather than the state’s one-year privacy claim limitation period, since the plaintiffs were alleging specific BIPA violations rather than a general privacy invasion.
On appeal, the appellate court considered the limitations question and determined, among other things, that since Illinois’ one-year statute of limitations applies only to published privacy violations, it can only govern BIPA claims filed under section 15(c)’s profit restrictions and section 15(d)’s disclosure/dissemination prohibitions. As such, plaintiffs suing under section 15(a)’s retention requirements, section 15(b)’s informed consent requirements, and section 15(e)’s data safeguarding requirements have five years to bring such claims since these duties “have absolutely no element of publication or dissemination.”
On September 15, the FTC warned health apps and connected devices collecting or using consumers’ health information that they must comply with the FTC’s Health Breach Notification Rule (Rule). The Rule requires companies to notify consumers and others if consumers’ health data is breached, and ensures that entities not covered by HIPAA are held accountable in the event of a security breach. Companies that fail to comply with the Rule may be subject to monetary penalties of up to $43,792 per violation per day. The FTC’s policy statement (approved by a 3-2 vote) clarifies the Rule’s scope and puts companies on notice of their reporting obligations. According to the FTC, health apps that are increasingly collecting sensitive and personal data from consumers have a responsibility to ensure the collected data is secured from unauthorized access. However, the FTC expressed concern that there are still few applicable privacy protections. “While this Rule imposes some measure of accountability on tech firms that abuse our personal information, a more fundamental problem is the commodification of sensitive health information, where companies can use this data to feed behavioral ads or power user analytics,” FTC Chair Lina M. Khan stated. “Given the growing prevalence of surveillance-based advertising, the Commission should be scrutinizing what data is being collected in the first place and whether particular types of business models create incentives that necessarily place users at risk.”
On September 2, the Irish Data Protection Commission (Commission) announced that a final decision was reached in a General Data Protection Regulation (GDPR) investigation into a U.S.-based messaging service’s handling of individuals’ personal information. The final Article 65 decision, published by the European Data Protection Board (EDPB), imposes a €225 million fine on the company and resolves an investigation into whether the company met its transparency obligations with respect to its data processing activities. The Commission alleged that the company violated provisions of the GDPR through the way it processed users’ and non-users’ data, as well as in the way it processed and shared data with other companies owned by the parent global social media company.
According to the final decision, “a number of concerned supervisory authorities” raised objections to aspects of the draft decision, taking issue, among other things, with the size of the proposed fine, which was originally set between €30 million and €50 million. Because the Commission was unable to reach a consensus with the objecting concerned supervisory authorities, a dispute resolution process was triggered. The EDPB ultimately ordered the Commission to reassess and increase its proposed fine. In addition to imposing the administrative fine, the Commission also ordered the company “to bring its processing into compliance by taking a range of specified remedial actions.”
On August 25, the New Mexico attorney general filed a lawsuit against an entertainment corporation for allegedly violating the Children’s Online Privacy Protection Act (COPPA) Rule and New Mexico’s Unfair Practices Act by knowingly collecting and selling personal information from children under the age of 13 without verifiable parental consent. According to the AG, the company purportedly collects data from children who play one of its gaming apps and sells it to third-party marketing companies, who, in turn, analyze and repackage the data to sell targeted advertisements to those same children. The complaint stated that, “[t]his conduct endangers the children of New Mexico, undermines the ability of their parents to protect children and their privacy, and violates state and federal law,” adding that the “surreptitious and intentional monitoring, tracking, and profiling of children—in direct violation not only of federal law but of longstanding societal norms—is egregious and highly offensive conduct.” The AG further emphasized that even if a game is targeted towards a broad audience, developers must still ensure that data is not collected from users under the age of 13 without parental consent. The complaint seeks an injunction to prohibit the company’s data collection practices as well as civil penalties and restitution.
On August 23, the U.S. Treasury Department and the Monetary Authority of Singapore finalized a bilateral Memorandum of Understanding (MOU) on cybersecurity cooperation. The MOU formalizes and strengthens a strong cybersecurity partnership between the two countries and, among other things, enhances cooperation in the following areas: (i) “[i]nformation sharing relating to the financial sector including cybersecurity regulations and guidance, cybersecurity incidents, and cybersecurity threat intelligence”; (ii) “[s]taff training and study visits to promote cooperation in the area of cybersecurity”; and (iii) “[c]ompetency-building activities such as the conduct of cross-border cybersecurity exercises.” According to Treasury Secretary Janet L. Yellen, the MOU serves “to improve the cyber resilience of both countries’ financial systems.”
Recently, a global technology corporation disclosed a €746 million (approximately $888 million) fine issued by the Luxembourg National Commission for Data Protection (CNPD) for alleged violations of the EU’s General Data Protection Regulation (GDPR). The corporation’s Form 10-Q for second quarter 2021 states that on July 16, the CNPD issued a decision against the corporation’s European headquarters, claiming its “processing of personal data did not comply with the [GDPR].” In addition to the fine, the decision also requires corresponding practice revisions, the details of which were not disclosed. The corporation noted that the decision is “without merit” and stated it intends to defend itself “vigorously” in this matter. According to sources, the decision follows an investigation started in 2018 when a French privacy group claiming to represent the interests of Europeans filed complaints against several large technology companies to ensure European consumer data is not manipulated for commercial or political purposes.
- Daniel R. Alonso to moderate an interactive roundtable at the Latin Lawyer and GIR Connect: Anti-Corruption & Investigations Conference
- APPROVED Checkpoint Webcast: You have license renewal questions, we have answers
- Jonice Gray Tucker to discuss “Fintech trends” at the BIHC Network Elevating Black Excellence Regional Summit
- Jeffrey P. Naimon to discuss “Truth in lending” at the American Bar Association National Institute on Consumer Financial Services Basics
- Daniel R. Alonso to discuss anti-money-laundering at FELABAN Spanish-language webinar “Perspective for banks: LAFT, FINCEN, OFAC, Cryptocurrency”
- Daniel R. Alonso to discuss “What’s new in BSA/AML compliance?” at the Institute of International Bankers Regulatory Compliance Seminar
- Jon David D. Langlois to discuss “Regulatory update: What you need to know under the new boss; It won’t be the same as the old boss” at the IMN Residential Mortgage Service Rights Forum (East)
- Benjamin B. Klubes to discuss “Creating a Fantastic Workplace Culture”
- John R. Coleman and Amanda R. Lawrence to discuss “Consumer financial services government enforcement actions – The CFPB and beyond” at the Government Investigations & Civil Litigation Institute Annual Meeting
- Jonice Gray Tucker to discuss “Consumer financial services” at the Practising Law Institute Banking Law Institute
- Jonice Gray Tucker to discuss “Regulators always ring twice: Responding to a government request” at ALM Legalweek