
InfoBytes Blog

Financial Services Law Insights and Observations


  • Colorado releases draft Colorado Privacy Act rules

    Privacy, Cyber Risk & Data Security

    On September 29, the Colorado attorney general filed proposed draft Colorado Privacy Act (CPA) rules with the Colorado Department of Regulatory Agencies. (See the Colorado Register here.) As covered by a Buckley Special Alert, the CPA was enacted last July to establish a framework for personal data privacy rights. The CPA provides consumers with numerous rights, including the right to access their personal data, opt out of certain uses of personal data, correct personal data, request deletion of personal data, and obtain a copy of personal data in a portable format. The CPA takes effect July 1, 2023, with certain opt-out provisions taking effect July 1, 2024. The AG has enforcement authority under the CPA, which does not provide a private right of action. The AG also has authority to promulgate rules to carry out the CPA’s requirements, to issue interpretive guidance and opinion letters, and to develop technical specifications for at least one universal opt-out mechanism.

    Pre-rulemaking considerations were released in April, in which the AG’s office stated that it planned to adopt a principle-based, rather than prescriptive, model for the state’s rulemaking approach (covered by InfoBytes here). Comments received on the pre-rulemaking considerations, as well as feedback from two public listening sessions, informed the drafting of the proposed rules. The AG’s office explained that, in weighing this feedback, it sought to clarify the CPA, simplify compliance, and ensure that the consumer privacy rights granted by the statute are protected, while also attempting to create a legal framework that “does not overly burden technological innovation” and operates in conjunction with other national, state, and international data privacy laws.

    • Definitions. The proposed rules define new terms in addition to those already set forth in the CPA, including terms related to biometric data and identifiers (including behavioral characteristics), bona fide loyalty programs, data brokers, automated processing, publicly available data, opt-out purposes and mechanisms, sensitive data inferences, and solely automated processing. “Sensitive data inferences” are inferences that indicate an individual’s racial or ethnic origin, religious beliefs, mental or physical health condition or diagnosis, sex life or sexual orientation, or citizenship or citizenship status. Controllers must obtain consent to process sensitive data inferences unless they meet specific requirements, and must comply with certain retention and deletion requirements for this type of information.
    • Disclosures. The proposed rules provide that disclosures, notifications, and other communications to consumers must be clear, accessible, and understandable; must be available in the languages in which the controller ordinarily does business; and must be accessible to consumers with disabilities (online notices should generally follow recognized industry standards, such as version 2.1 of the Web Content Accessibility Guidelines).
    • Consumer personal data rights. The proposed rules outline requirements for submitting data rights requests, including through online and in-person methods, and require controllers to use reasonable data security measures when exchanging information. Among other things, requests should be easy to execute, require a minimal number of steps, and not require a consumer to create a new user account. Notably, a data rights request method does not have to be specific to Colorado, provided it “clearly indicates which rights are available to Colorado consumers.” Controllers must also provide instructions on how to appeal a data rights request decision.
    • Opt-out rights and mechanisms. Under the proposed rules, controllers must cease processing a consumer’s personal data for opt-out purposes as soon as feasibly possible, and no later than 15 days after the request is received (authorized agents may exercise a consumer’s opt-out right provided certain criteria are met). A record of opt-out requests and responses must also be maintained. Clear and conspicuous opt-out methods must be provided in a controller’s privacy notice, as well as in a readily accessible location outside the privacy notice “at or before the time” the personal data is processed for opt-out purposes. The proposed rules also provide that the Colorado Department of Law will maintain a public list of universal opt-out mechanisms that the AG’s office has recognized as meeting the required standards; they detail deployment requirements and state that ease of use, implementation, and detection, among other factors, will be considered when determining which universal opt-out mechanisms will be recognized. Additionally, a universal opt-out mechanism may take the form of a “do not sell list” that controllers query in an automated manner.
    • Right of access, and right to correction, deletion, and data portability. The proposed rules outline controller requirements for handling consumers’ requests to access, correct, or delete their personal data, as well as instructions for complying with data portability requests. The proposed rules also consider instances where personal data may be corrected more quickly and easily through account settings than through the data rights review process.
    • Data minimization. Under the proposed rules, controllers would be required to “specify the express purposes” for which personal data is collected and processed in a manner that is “sufficiently unambiguous, specific, and clear.” Controllers must also consider each processing activity to determine whether it meets the requirement to use only the minimum personal information necessary, adequate, or relevant for the express purpose.
    • Data protection assessments. The proposed rules provide a list of 18 elements for controllers to include when assessing whether a processing activity presents a “heightened risk of harm,” including the specific purpose of the processing activity, procedural safeguards, alternative processing activities, discrimination harms, and the dates the assessment was reviewed and approved. The proposed rules also require that these assessments be revisited and updated at least annually and, in certain instances, reviewed for fairness and disparate impact. Assessments are required for activities conducted after July 1, 2023, and are not retroactive.
    • Profiling. Under the proposed rules, controllers are obligated to clearly inform consumers when their personal data is being used for profiling. Consumers must also have the right to opt out of profiling in connection with decisions that result in legal or similar effects on consumers, and controllers that engage in profiling must provide additional disclosures in their privacy notices. A controller may deny a consumer’s request to opt out if there is human involvement in the automated processing, but is required to provide additional notice in such cases.

    The proposed rules also contain provisions addressing requirements for refreshing consent, how data right requests impact loyalty programs and the disclosures that are required for these programs, and how a consumer’s right to delete might impact a controller’s ability to provide program benefits.
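    The proposed rules do not name a specific universal opt-out mechanism, but one existing example of such a signal is the Global Privacy Control (GPC), which participating browsers transmit as a `Sec-GPC: 1` HTTP request header. A minimal, hypothetical server-side check is sketched below; the function name and header handling are illustrative, not drawn from the proposed rules:

```python
def honor_opt_out(headers: dict) -> bool:
    """Return True if the request carries a universal opt-out signal.

    Assumes the Global Privacy Control convention, in which a
    participating browser sends the header `Sec-GPC: 1`. Header names
    are normalized to lowercase and values stripped before comparison.
    """
    normalized = {k.lower(): v.strip() for k, v in headers.items()}
    return normalized.get("sec-gpc") == "1"

# A request from a browser with GPC enabled should be treated as an opt-out:
print(honor_opt_out({"Sec-GPC": "1", "User-Agent": "ExampleBrowser"}))  # True
print(honor_opt_out({"User-Agent": "ExampleBrowser"}))                  # False
```

A controller honoring such a signal would then suppress targeted-advertising and sale-of-data processing for that request, consistent with the 15-day processing deadline described above.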

    Comments on the proposed rules will be accepted between October 10, 2022 and February 1, 2023. On February 1, a proposed rulemaking public hearing will be held to hear testimony from stakeholders.

    Privacy, Cyber Risk & Data Security State Issues Colorado Colorado Privacy Act State Attorney General Consumer Protection

  • District Court grants preliminary approval of data breach class action

    Courts

    On October 3, the U.S. District Court for the Eastern District of Wisconsin granted preliminary approval of a data breach class action settlement. According to the plaintiff’s unopposed motion for preliminary approval, a ransomware attack on the company potentially allowed an unauthorized actor to access the personal information of approximately two million of the company’s patients, employees, employee beneficiaries, and other individuals from May 28, 2021 to June 4, 2021. The company announced the ransomware attack in a data breach notice sent to customers on June 24, 2021. The plaintiff filed her complaint alleging, among other things, that the company “failed to take adequate measures to protect her and other putative Class Members’ Personal Information and failed to disclose that [the company’s] systems were susceptible to a cyberattack.” After other plaintiffs filed suit, the plaintiffs moved to consolidate the actions and alleged several violations, including negligence and breach of implied contract. The settlement provides for a $3.7 million settlement fund. Each class member is eligible to submit a claim for two years of three-bureau credit monitoring and up to $1 million of insurance coverage for identity theft incidents. Additionally, class members can submit a claim for up to $10,000 in documented losses. The settlement also provides class members with lost time payment and cash fund payment options (in the alternative to all the foregoing settlement benefits).

    Courts Privacy, Cyber Risk & Data Security Class Action Settlement Data Breach

  • Arizona reaches $85 million settlement in location tracking suit

    Privacy, Cyber Risk & Data Security

    On October 4, the Arizona attorney general announced an $85 million settlement with an internet technology company to resolve allegations that it collected individuals’ location data for targeted advertising without users’ knowledge or consent or after users opted out of the feature through the platform’s settings. The AG initiated an investigation in 2018 into the company’s practices after sources claimed that the platform surreptitiously collected and sold location information through other settings even though users believed disabling the “Location History” setting would ensure this would not occur. The AG sued the company in 2020, claiming violations of the Arizona Consumer Fraud Act. Among other things, the AG alleged the company’s disclosures misled users into believing these other settings had nothing to do with tracking user location, and that the company used “deceptive and unfair practices to collect as much user information as possible” and made it difficult for users to understand what was being done with their data or opt out of data sharing. Without admitting any wrongdoing, the company agreed to the terms of the settlement agreement and will pay Arizona $85 million, of which the majority will go toward “education, broadband, and [i]nternet privacy efforts and purposes.”

    Privacy, Cyber Risk & Data Security State Issues Arizona Settlement State Attorney General

  • White House proposes AI “Bill of Rights”

    Federal Issues

    Recently, the Biden administration’s Office of Science and Technology Policy released a Blueprint for an AI Bill of Rights. The blueprint’s proposed framework identifies five principles for guiding the design, use, and deployment of automated systems to protect the public as the use of artificial intelligence grows. The principles center around topics related to stronger safety measures, such as (i) ensuring systems are safe and effective; (ii) implementing proactive protections against algorithmic discrimination; (iii) incorporating built-in privacy protections, including providing the public control over how data is used and ensuring that the data collection meets reasonable expectations and is necessary for the specific context in which it is being collected; (iv) providing notice and explanation as to how an automated system is being used, as well as the resulting outcomes; and (v) ensuring the public is able to opt out from automated systems in favor of a human alternative and has access to a person who can quickly help remedy problems. According to the announcement, the proposed framework’s principles should be incorporated into policies governing systems with “the potential to meaningfully impact” an individual or community’s rights or access to resources and services related to education, housing, credit, employment, health care, government benefits, and financial services, among others.

    Federal Issues Privacy, Cyber Risk & Data Security Biden Artificial Intelligence Fintech

  • District Court grants plaintiff’s injunction in data scraping suit

    Courts

    On September 30, the U.S. District Court for the Northern District of California certified a stipulation and proposed order regarding a permanent injunction and dismissal to abandon remaining allegations against an Israel-based company and a Delaware company (collectively, defendants) related to their use of data scraping from the parent company of large social media platforms (plaintiff). In 2020, the plaintiff alleged that the defendants developed and distributed internet browser extensions to illegally scrape data from the plaintiff’s platform and other platforms. The order noted that the court’s prior summary judgment decision concluded that the defendants collected data using “self-compromised” accounts of users who had downloaded the defendants’ browser extensions. The order further noted that the defendants stipulated that the plaintiff had established that it suffered “irreparable injury” and incurred a loss of at least $5,000 in a one-year period as a result of one of the companies’ unauthorized access. The order further noted that judgment has been established “based on [the Israel-based company’s] active data collection through legacy user products beginning October 2020, and based on [the Israel-based company’s] direct access to password-protected pages on [the plaintiff’s] platforms using fake or purchased user accounts.” Under the injunction, the defendants are immediately and permanently barred from accessing or using two of the plaintiff’s social media platforms without the plaintiff’s express written permission, regardless of whether the companies are using the platforms directly or via a third party. The defendants are also banned from collecting data, or assisting others in collecting data, without the plaintiff’s permission, and are required to delete any and all software, scripts, or code designed to access or interact with two of the plaintiff’s social media platforms. Additionally, the defendants are prohibited from using or selling any data that they have previously collected from the plaintiff’s social media platforms.

    Courts Privacy, Cyber Risk & Data Security Data Scraping Social Media Data Collection / Aggregation

  • U.S.-UK Data Access Agreement now in effect

    Privacy, Cyber Risk & Data Security

    On October 3, the DOJ announced that the U.S.-UK Data Access Agreement (Agreement) is now in effect. According to the DOJ, the Agreement, authorized by the Clarifying Lawful Overseas Use of Data (CLOUD) Act, is the first of its kind and will allow investigators from each country to gain better access to vital data to combat serious crime in a manner “consistent with privacy and civil liberties standards.” Under the Agreement, “service providers in one country may respond to qualifying, lawful orders for electronic data issued by the other country, without fear of running afoul of restrictions on cross-border disclosures,” the DOJ said. The Agreement is intended to foster “more timely and efficient access to electronic data required in fast-moving investigations through the use of orders covered by the Agreement,” and will greatly improve the two countries’ ability “to prevent, detect, investigate, and prosecute serious crime, including terrorism, transnational organized crime, and child exploitation, among others.” U.S. and UK officials must meet numerous requirements to invoke the Agreement; for example, orders must relate to a serious crime and may not target persons located in the country to which the order is submitted. Authorities in both countries must also follow agreed-upon requirements, limitations, and conditions when obtaining and using data under the Agreement. The DOJ’s Office of International Affairs has been selected as the designated authority responsible for implementing the Agreement in the U.S. and “has created a CLOUD team to review and certify orders that comply with the Agreement on behalf of federal, state, local, and territorial authorities located in the United States, transmit certified orders directly to UK service providers, and arrange for the return of responsive data to the requesting authorities.”

    Privacy, Cyber Risk & Data Security DOJ UK Of Interest to Non-US Persons Investigations

  • Democrats urge FTC to update COPPA

    Privacy, Cyber Risk & Data Security

    On September 29, Senator Edward J. Markey (D-MA), along with three other Congressional Democrats, sent a letter to FTC Chair Lina Khan requesting that the Commission update its regulations under the Children’s Online Privacy Protection Act (COPPA). The Senators encouraged the FTC to use its regulatory authority to update COPPA to implement additional protections addressing online threats to children as their use of technology increases. They laid out several areas for the FTC’s consideration, including (i) “expanding the definition of ‘personal information’ covered under COPPA”; (ii) “implementing rules to effectuate COPPA’s prohibition on conditioning a child’s participation in an online activity on the child sharing more data than is reasonably necessary”; (iii) “implementing rules to effectuate COPPA’s requirement that platforms protect the confidentiality, security, and integrity of children’s data”; (iv) “ensuring that COPPA’s requirements protect children on the platforms they actually use by updating COPPA’s regulations defining platforms that are directed to children and updating regulations defining platforms that have actual knowledge they are collecting data from children”; (v) “implementing regulatory protections that reflect the increased use of online platforms for educational purposes”; and (vi) “implementing regulatory protections that reflect changes in online advertising practices.”

    The Senators also applauded the FTC’s recently issued advanced notice of proposed rulemaking requesting feedback on questions related to a wide range of concerns about commercial surveillance practices (covered by InfoBytes here), including those involving children and teens, and advised the Commission to closely review and consider expert responses when crafting its rules aimed at the protection of children’s privacy.

    Privacy, Cyber Risk & Data Security Agency Rule-Making & Guidance FTC Federal Issues COPPA Consumer Protection

  • FCC proposes rulemaking to combat unlawful text messages

    Agency Rule-Making & Guidance

    On September 27, the FCC announced a notice of proposed rulemaking (NPRM) to target and eliminate unlawful text messages. According to the FCC, the number of consumer complaints about unwanted text messages increased by 146 percent between 2019 and 2020 and has continued to grow in 2022. The Commission warns that these text messages present harms beyond those of unwanted phone calls, as text messages can include phishing and malware links. More than $86 million was stolen in 2020 through spam-texting fraud schemes, the FCC reports. The NPRM seeks feedback on several topics, including whether providers should follow the STIR/SHAKEN authentication protocols for text messages as they do for phone calls, whether providers should block texts from invalid phone numbers, and how the Commission can ensure that emergency text messages and other appropriate texts are not erroneously blocked. The NPRM also proposes requiring providers to block texts that appear to originate from phone numbers that are invalid, unallocated, or unused, as well as from numbers on the “Do-Not-Originate” list.
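    As an illustration of the kind of origin-number blocking the NPRM contemplates, the sketch below filters sender numbers against simplified North American Numbering Plan validity rules and a hypothetical “Do-Not-Originate” list. The rules and names here are illustrative assumptions, not FCC requirements:

```python
# Hypothetical sketch of origin-number filtering along the lines the NPRM
# describes: block texts from invalid numbers or DNO-listed numbers.
DO_NOT_ORIGINATE = {"+18005551234"}  # placeholder entries for illustration

def is_valid_nanp(number: str) -> bool:
    """Simplified NANP validity check: '+1' followed by 10 digits, where
    neither the area code nor the exchange code may begin with 0 or 1."""
    if not (number.startswith("+1") and len(number) == 12 and number[2:].isdigit()):
        return False
    digits = number[2:]
    return digits[0] not in "01" and digits[3] not in "01"

def should_block(origin: str) -> bool:
    """Block texts originating from invalid numbers or DNO-listed numbers."""
    return (not is_valid_nanp(origin)) or origin in DO_NOT_ORIGINATE

print(should_block("+10005551234"))  # True: area code begins with 0
print(should_block("+18005551234"))  # True: on the Do-Not-Originate list
print(should_block("+12125554321"))  # False: plausible number, not listed
```

A real implementation would consult the numbering-plan administrator’s allocation data rather than these format heuristics, since the NPRM targets unallocated and unused numbers as well as structurally invalid ones.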

    The Commission is also seeking input on the extent to which spoofing is a problem in texting, and on whether caller ID authentication standards should be applied to texting. Spoofing occurs when a sender deliberately disguises its number to trick a recipient into thinking the message is trustworthy. A working group of the Internet Engineering Task Force is currently considering a draft standard that would apply parts of the STIR/SHAKEN framework to text messages, the FCC stated, adding that it is asking stakeholders for suggestions on an ideal timeline and feedback on whether the current framework’s governance system could accommodate authentication for text messages or whether the framework would require more comprehensive technology network upgrades.

    Comments on the NPRM are due 30 days after publication in the Federal Register.

    Agency Rule-Making & Guidance Privacy, Cyber Risk & Data Security FCC Text Messages

  • CISA urges companies to take action to combat malicious cyber activity

    Privacy, Cyber Risk & Data Security

    On September 14, the Cybersecurity and Infrastructure Security Agency, along with several other federal agencies and international partners, released a joint cybersecurity advisory (CSA) highlighting continued malicious cyber activity taken by advanced persistent threat actors affiliated with the Iranian Government’s Islamic Revolutionary Guard Corps (IRGC). The CSA recommended that companies continually test their security programs to protect against longstanding online threats that may arise from IRGC-affiliated actors known for exploiting vulnerabilities for ransom operations. “Our unified purpose is to drive timely and prioritized adoption of mitigations and controls that are most effective to reducing risk to all cyber threats,” CISA said in its announcement. Under Secretary of the Treasury for Terrorism and Financial Intelligence Brian E. Nelson added that the U.S. Treasury Department “is dedicated to collaborating with other U.S. government agencies, allies, and partners to combat and deter malicious cyber-enabled actors and their activities, especially ransomware and cybercrime that targets economic infrastructure.” He noted that the CSA provides information on specific tactics, techniques, and procedures used by IRGC-affiliated actors, and advised both the public and private sector to use the information to strengthen cybersecurity resilience and reduce the risk of ransomware incidents. Organizations are encouraged to review a 2021 Treasury advisory, which highlights the sanctions risks associated with ransomware payments and provides steps for companies to take to mitigate the risk of being a victim of ransomware (covered by InfoBytes here).

    Privacy, Cyber Risk & Data Security Financial Crimes Iran CISA Of Interest to Non-US Persons Ransomware

  • California adopts “first-in-nation” act to safeguard children’s online data and privacy

    Privacy, Cyber Risk & Data Security

    On September 15, the California governor signed into law the California Age-Appropriate Design Code Act (the Act), calling it the “first-in-nation” bill to protect children’s online data and privacy. AB 2273 establishes new legal requirements for businesses that provide online products and services that are “likely to be accessed by children” under 18 years of age based on certain factors. These factors include whether the feature is: (i) “directed to children,” as defined by the Children’s Online Privacy Protection Act (COPPA); (ii) “determined, based on competent and reliable evidence regarding audience composition, to be routinely accessed by a significant number of children”; (iii) advertised to children; (iv) substantially similar to, or the same as, an online service, product, or feature routinely accessed by a significant number of children; (v) designed to appeal to children; or (vi) determined to be, based on internal company research, significantly accessed by children. Notably, in contrast to COPPA, the Act more broadly defines “child” as a consumer who is under the age of 18 (COPPA defines “child” as an individual under 13 years of age).

    The Act also outlines specific requirements for covered businesses, including:

    • Businesses must configure all default privacy settings offered by the online service, product, or feature to one that offers a high level of privacy, “unless the business can demonstrate a compelling reason that a different setting is in the best interests of children”;
    • Businesses must “concisely” and “prominently” provide clear privacy information, terms of service, policies, and community standards suited to the age of the children likely to access the online service, product, or feature;
    • For any new online service, product, or feature that is likely to be accessed by children and is offered to the public before July 1, 2024, businesses must complete a Data Protection Impact Assessment (DPIA) on or before that date. Businesses must also document any “risk of material detriment to children” identified in the DPIA, create a mitigation plan, and, upon written request, provide the DPIA to the state attorney general;
    • Businesses must “[e]stimate the age of child users with a reasonable level of certainty appropriate to the risks that arise from the data management practices of the business or apply the privacy and data protections afforded to children to all consumers”;
    • Should an online service, product, or feature allow a child’s parent, guardian, or any other consumer to monitor the child’s online activity or track the child’s location, businesses must provide an obvious signal to the child when the child is being monitored or tracked;
    • Businesses must “[e]nforce published terms, policies and community standards established by the business, including, but not limited to, privacy policies and those concerning children”; and
    • Businesses must provide prominent, accessible, and responsive tools to help children (or their parents/guardians) exercise their privacy rights and report concerns.
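    The first requirement above (high-privacy defaults) is the kind of obligation typically implemented as conservative default configuration, with any relaxation requiring an affirmative opt-in. The setting names below are invented for illustration and are not taken from the Act:

```python
from dataclasses import dataclass

@dataclass
class PrivacySettings:
    """Hypothetical default settings illustrating "a high level of
    privacy" by default; each field would need an explicit opt-in
    (or a documented compelling reason) to be relaxed for a child user."""
    profile_public: bool = False
    precise_geolocation: bool = False
    personalized_ads: bool = False
    share_with_third_parties: bool = False

# New accounts start with every data-sharing feature disabled:
settings = PrivacySettings()
print(settings.profile_public)           # False
print(settings.share_with_third_parties) # False
```

The Act’s “compelling reason … in the best interests of children” exception would correspond to a documented override of one of these defaults, not a different default.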

    Additionally, covered businesses are prohibited from using a child’s personal information (i) in a way that the business knows, or has reason to know, is materially detrimental to a child’s physical health, mental health, or well-being; or (ii) for any reason other than a reason for which the personal information was collected, unless a business can show a compelling reason that using the personal information is in the “best interests of children.” The Act also places restrictions on profiling, collecting, selling, or sharing children’s geolocation data, or using dark patterns to encourage children to provide personal information beyond what is reasonably expected.

    The Act also establishes the California Children’s Data Protection Working Group, which will study and report to the legislature best practices for implementing the Act, and will also, among other things, evaluate ways to leverage the expertise of the California Privacy Protection Agency in the long-term development of data privacy policies that affect the privacy, rights, and safety of children online. The state attorney general is tasked with enforcing the Act and may seek an injunction or civil penalty against any business that violates its provisions. Violators may be subject to a penalty of up to $2,500 per affected child for each negligent violation, and up to $7,500 per affected child for each intentional violation; however, businesses may be provided a 90-day cure period if they have achieved “substantial compliance” with the Act’s assessment and mitigation requirements.

    The Act takes effect July 1, 2024.

    Privacy, Cyber Risk & Data Security State Issues State Legislation Consumer Protection California COPPA CPPA State Attorney General Enforcement
