
InfoBytes Blog

Financial Services Law Insights and Observations


  • New York considers privacy legislation broader than the CCPA

    Privacy, Cyber Risk & Data Security

    On November 22, the New York Senate’s Committee on Consumer Protection and Committee on Internet and Technology held a joint hearing titled, “Consumer Data and Privacy on Online Platforms,” which discussed the proposed New York Privacy Act, SB S5642 (the Act). The Act was introduced in May and seeks to regulate the storage, use, disclosure, and sale of consumer personal data by entities that conduct business in New York State or produce products or services that are intentionally targeted to residents of New York State. The Act contains provisions that differ from those of the California Consumer Privacy Act (CCPA), which is set to take effect on January 1, 2020 (visit here for InfoBytes coverage on the CCPA). Highlights of the Act include:

    • Fiduciary Duty. Most notably, the Act requires that legal entities “shall act in the best interests of the consumer, without regard to the interests of the entity, controller or data broker, in a manner expected by a reasonable consumer under the circumstances.” Specifically, the Act states that personal data of consumers “shall not be used, processed or transferred to a third party, unless the consumer provides express and documented consent.” The Act imposes a duty of care on every legal entity, or affiliate of a legal entity, with respect to securing consumer personal data against privacy risk and requires prompt disclosure of any unauthorized access. Moreover, the Act requires that legal entities enter into a contract imposing the same duty of care for consumer personal data on a third party prior to disclosing, selling, or sharing the data with that party.
    • Consumer Rights. The Act requires covered entities to provide consumers notice of their rights under the Act and provide consumers with the opportunity to opt-in or opt-out of the “processing of their personal data” using a method where the consumer must clearly select and indicate their consent or denial. Upon request, and without undue delay, covered entities are required to correct inaccurate personal data or delete personal data.
    • Transparency. The Act requires covered entities to provide a “clear, meaningful privacy notice” that is “in a form that is reasonably accessible to consumers,” which should include: the categories of personal data to be collected; the purpose for which the data is used and disclosed to third parties; the rights of the consumer under the Act; the categories of data shared with third parties; and the names of third parties with whom the entity shares data. If the entity sells personal data or processes data for direct marketing purposes, it must disclose the processing, as well as the manner in which a consumer may object to the processing.
    • Enforcement. The Act defines violations as an unfair or deceptive act in trade or commerce, as well as an unfair method of competition. The Act allows the attorney general to bring an action for violations and also provides a private right of action for any harmed individual. Covered entities are subject to injunction and liable for damages and civil penalties.

    According to reports, state lawmakers at the November hearing indicated that federal requirements would be “the best scenario,” but in the absence of Congressional movement in the area, one state senator noted that state legislators must “assure [their] constituents that [the state legislature is] doing everything possible to protect their privacy.” Witnesses expressed concern that the Act would place many new requirements on businesses that differ from what other states have already enacted, and encouraged more consistent baseline standards for compliance instead of a patchwork approach. Some witnesses expressed specific concern with the opt-in requirement for the collection and use of consumer data, noting that waiting for consumers to opt in, as opposed to simply letting them opt out, makes compliance difficult to administer. Lastly, many witnesses were displeased with the broad private right of action in the Act, but consumer groups praised the provision, noting that the state attorney general does not have the resources to regulate and enforce against all the data collection and sharing in the state.

    Privacy/Cyber Risk & Data Security State Legislation State Issues Enforcement State Attorney General

  • Buckley Insights: Leveraging open source intelligence for cyber threat modeling

    Privacy, Cyber Risk & Data Security

    The FTC Safeguards Rule, FFIEC Cybersecurity and IT Guidance, and OCC guidelines (here and here) emphasize the need for cyber threat intelligence (CTI) and threat identification to inform an organization’s overall cyber risk identification, assessment, and mitigation program. Indeed, to successfully implement a risk-based information security program, an organization must be aware of general cybersecurity risks that cut across all industries, as well as business-sector risks and risks unique to the organization. Furthermore, proposed revisions to the FTC Safeguards Rule (previously covered by InfoBytes here) emphasize the need for a “thorough and complete risk assessment” that is informed by “possible vectors through which the security, confidentiality, and integrity of that information could be threatened.”

    Threat modeling is generally understood as a formal process by which an organization identifies specific cyber threats to its information systems and sensitive information, giving management insight into the defenses needed; the critical risk areas within and across an information system, network, or business process; and the best allocation of scarce resources to address the critical risks. A generally accepted threat modeling process involves comprehensive system, application, and network mapping and data flow diagrams. Many threat modeling tools are available free to the public, such as Microsoft’s Threat Modeling Tool, which provides diagramming and analytical resources for network and data flow diagrams and uses the STRIDE model (spoofing, tampering, repudiation, information disclosure, denial of service, and elevation of privilege) to inform the user of general cyber-attack vectors that each organization should consider. Generally, between cybersecurity frameworks such as the NIST Cybersecurity Framework (for risk-based analytical approaches) and threat modeling tools identifying generic cyber threats such as STRIDE (for general or sector-specific cyber risks), an organization can achieve a risk-informed information security program.
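
    As a rough illustration of the STRIDE-style enumeration described above, the following Python sketch walks a toy data flow diagram and lists the threat categories conventionally considered for each element type. The element names and the category-to-element mapping are illustrative assumptions, not output of Microsoft’s Threat Modeling Tool or a recommended configuration.

```python
# Illustrative STRIDE enumeration over a toy data flow diagram.
# The element types, example elements, and applicability mapping below are
# assumptions for demonstration; a real threat model would be driven by the
# organization's own system, application, and network mapping.

STRIDE = {
    "S": "Spoofing",
    "T": "Tampering",
    "R": "Repudiation",
    "I": "Information disclosure",
    "D": "Denial of service",
    "E": "Elevation of privilege",
}

# Simplified view of which STRIDE categories are commonly considered
# for each data-flow-diagram element type.
APPLICABLE = {
    "external_entity": "SR",
    "process": "STRIDE",
    "data_store": "TRID",
    "data_flow": "TID",
}

# Hypothetical elements from a small web application data flow diagram.
ELEMENTS = [
    ("Customer browser", "external_entity"),
    ("Web application", "process"),
    ("Customer database", "data_store"),
    ("Browser -> Web app (HTTPS)", "data_flow"),
]

def enumerate_threats(elements):
    """Yield (element, threat category) pairs to seed a threat register."""
    for name, element_type in elements:
        for code in APPLICABLE[element_type]:
            yield name, STRIDE[code]

if __name__ == "__main__":
    for element, threat in enumerate_threats(ELEMENTS):
        print(f"{element}: consider {threat}")
```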

    However, with the increasing number of large-scale data breaches and the evolving complexity of cybersecurity threats, many regulatory agencies and industry standards bodies have called on organizations to go one step further and use CTI to understand the tactics, techniques, and procedures (TTPs) employed by attackers. By using CTI and other threat-based models, organizations can gain insight into potential attack vectors through red-teaming and penetration testing, simulating each phase of a hypothetical attack on the organization’s information system and determining potential countermeasures that can be employed at each step of the kill chain. For instance, Lockheed Martin’s formal kill chain model involves seven steps (reconnaissance, weaponization, delivery, exploitation, installation, command and control, and actions on objective) and proposes six potential defensive measures at each step (detect, deny, disrupt, degrade, deceive, and contain). Consequently, an organization can layer its defenses along each step in the kill chain to increase the probability of detecting or preventing an attack. The kill chain model was used as part of a U.S. Senate investigation into the data breach of a major corporation in 2013, identifying several stages along the chain where the attack could have been prevented or detected.
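
    To make the layered-defense idea concrete, the sketch below builds a simple coverage matrix that maps an organization’s existing controls onto the kill chain phases and defensive actions listed above, exposing phases with no coverage. The controls and their phase and action tags are hypothetical examples, not a recommended or complete control set.

```python
# Sketch of a "courses of action" matrix layered along the kill chain phases
# described above. The control-to-phase mapping is a hypothetical example.

KILL_CHAIN = [
    "reconnaissance", "weaponization", "delivery", "exploitation",
    "installation", "command and control", "actions on objective",
]
ACTIONS = ["detect", "deny", "disrupt", "degrade", "deceive", "contain"]

# Example controls an organization might already have, tagged with the
# phase they address and the defensive action they provide (assumed).
controls = [
    ("web proxy logging",          "reconnaissance",       "detect"),
    ("email attachment filtering", "delivery",             "deny"),
    ("endpoint protection",        "installation",         "disrupt"),
    ("egress firewall rules",      "command and control",  "deny"),
    ("DLP monitoring",             "actions on objective", "detect"),
]

def coverage_matrix(controls):
    """Return {phase: {action: [controls]}} to expose gaps in layered defense."""
    matrix = {phase: {action: [] for action in ACTIONS} for phase in KILL_CHAIN}
    for name, phase, action in controls:
        matrix[phase][action].append(name)
    return matrix

if __name__ == "__main__":
    matrix = coverage_matrix(controls)
    for phase in KILL_CHAIN:
        covered = {action: names for action, names in matrix[phase].items() if names}
        print(f"{phase}: {covered if covered else 'NO COVERAGE'}")
```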

    This threat identification process requires greater detail on adversarial TTPs. Fortunately, MITRE has made its ATT&CK (adversarial tactics, techniques, and common knowledge) platform publicly available. ATT&CK collects and organizes adversarial TTPs in specific detail and provides information on each technique and potential mitigating procedures, including commonly used attack patterns for each. For instance, one technique identified by ATT&CK is encrypting data before exfiltration to avoid detection by data loss prevention (DLP) tools or other network anomaly detection tools; ATT&CK identifies more than forty known techniques and tools that have been used to achieve encrypted transmission. ATT&CK also identifies potential detection and mitigation options, such as scanning unencrypted channels for encrypted files using DLP or intrusion detection software. Thus, instead of a generic data breach risk analysis, organizations can understand the specific TTPs that may make data breach detection and analysis more difficult, and possibly take measures to prevent them.
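
    For organizations that want to consume ATT&CK programmatically, the sketch below pulls the publicly available Enterprise ATT&CK STIX bundle and lists the techniques tagged with the exfiltration tactic. The bundle URL and STIX field names reflect MITRE’s public CTI repository as commonly documented, but they are assumptions here and should be verified against current MITRE documentation before relying on them.

```python
# Sketch: pull the public Enterprise ATT&CK STIX bundle and list techniques
# associated with the "exfiltration" tactic. The URL and field names below
# are assumptions based on MITRE's public CTI repository and may change.
import json
import urllib.request

ATTACK_BUNDLE_URL = (  # assumed location of the Enterprise ATT&CK STIX bundle
    "https://raw.githubusercontent.com/mitre/cti/master/"
    "enterprise-attack/enterprise-attack.json"
)

def exfiltration_techniques(url=ATTACK_BUNDLE_URL):
    """Return names of non-revoked attack-pattern objects tagged 'exfiltration'."""
    with urllib.request.urlopen(url) as response:
        bundle = json.load(response)
    names = []
    for obj in bundle.get("objects", []):
        if obj.get("type") != "attack-pattern" or obj.get("revoked"):
            continue
        phases = obj.get("kill_chain_phases", [])
        if any(phase.get("phase_name") == "exfiltration" for phase in phases):
            names.append(obj.get("name"))
    return sorted(names)

if __name__ == "__main__":
    for name in exfiltration_techniques():
        print(name)
```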

    By leveraging open-source CTI from tools such as ATT&CK and other reports from third-party sources such as government and industry alerts, organizations can begin the process of designing proactive defenses against cyber threats. It is important to note, however, that ATT&CK can only inform an organization’s threat modeling and is not a threat model itself; additionally, ATT&CK focuses on penetration and hacking TTPs and, therefore, does not examine other threats that organizations may face, including distributed denial of service (DDoS) attacks that threaten the availability of systems. Such threats will still need to be accounted for in any financial organization’s risk assessment, particularly if such attacks prevent its clients from accessing their financial accounts and, ultimately, their money.

    Privacy/Cyber Risk & Data Security Data Breach FTC OCC FFIEC

  • U.K. ICO and social media company settle privacy investigation

    Privacy, Cyber Risk & Data Security

    On October 30, the U.K. Information Commissioner’s Office (ICO) announced an agreement reached between the ICO and a social media company that resolves an investigation into the company’s alleged misuse of personal data. The company has agreed to withdraw its appeal of the £500,000 penalty issued last year under section 55A of the Data Protection Act 1998 (DPA) and settle the case without an admission of guilt. The investigation stems from a data incident affecting upwards of 87 million users worldwide that included the processing of personal data about U.K. users in the context of a U.K. establishment. According to the ICO, the company violated principles of the DPA by (i) unfairly processing personal data; and (ii) failing “to take appropriate technical and organi[z]ational measures against unauthori[z]ed or unlawful processing of personal data.” The ICO published a statement by the company’s associate general counsel in which he noted that the company has “made major changes” to its platform that significantly restrict the information accessible to app developers, and that “[p]rotecting people’s information and privacy is a top priority for [the company].”

    Privacy/Cyber Risk & Data Security Information Commissioner's Office U.K. Of Interest to Non-US Persons Settlement

  • NIST publishes updated Big Data Interoperability Framework

    Privacy, Cyber Risk & Data Security

    On October 21, the National Institute of Standards and Technology (NIST) released the second revision of its Big Data Interoperability Framework (NBDIF), which aims to “develop consensus on important, fundamental concepts related to Big Data” with the understanding that Big Data systems have the potential to “overwhelm traditional technical approaches,” including traditional approaches to privacy and data security. Modest updates were made to Volume 4 of the NBDIF, which focuses on privacy and data security, including a recommended layered approach to Big Data system transparency. With respect to transparency, Volume 4 introduces three levels, starting with level 1, which involves a System Communicator that “provides online explanations to users or stakeholders” discussing how information is processed and retained in a Big Data system, as well as records of “what has been disclosed, accepted, or rejected.” At the most mature levels, transparency includes developing digital ontologies (multi-level architectures for digital data management) across domain-specific Big Data systems to enable adaptable privacy and security configurations based on user characteristics and populations. Largely intact, however, are the Big Data Safety Levels in Appendix A, which are voluntary (standalone) standards regarding best practices for privacy and data security in Big Data systems and include application security, business continuity, and transparency aspects.

    Privacy/Cyber Risk & Data Security Big Data NIST

  • Special Alert: California attorney general releases proposed CCPA regulations

    Privacy, Cyber Risk & Data Security

    Buckley Special Alert

    Last week, the California attorney general released the highly anticipated proposed regulations implementing the California Consumer Privacy Act (CCPA). The CCPA — which was enacted in June 2018 (covered by a Buckley Special Alert), amended several times, most recently by amendments signed into law on Oct. 11, and currently set to take effect on Jan. 1, 2020 — directed the California attorney general to issue regulations to further the law’s purpose.

    * * *

    Click here to read the full special alert.

    If you have any questions about the CCPA or other related issues, please visit our Privacy, Cyber Risk & Data Security practice page, or contact a Buckley attorney with whom you have worked in the past.

    Privacy/Cyber Risk & Data Security State Issues CCPA State Attorney General State Regulators Special Alerts Of Interest to Non-US Persons CCPA/EU

  • California attorney general releases proposed CCPA regulations

    Privacy, Cyber Risk & Data Security

    On October 10, the California attorney general released the highly anticipated proposed regulations implementing the California Consumer Privacy Act (CCPA). The CCPA—which was enacted in June 2018 (covered by a Buckley Special Alert), amended in September 2018, amended again in October 2019 (pending Governor Gavin Newsom’s signature), and is currently set to take effect on January 1, 2020 (InfoBytes coverage on the amendments available here and here)—directed the California attorney general to issue regulations to further the law’s purpose. The proposed regulations address a variety of topics related to the law, including:

    • How a business should provide disclosures required by the CCPA, such as the notice at collection of personal information, the notice of financial incentive, the privacy policy, and the opt-out notice;
    • The handling of consumer requests made under the CCPA, such as requests to know, requests to delete, and requests to opt-out;
    • Service provider classification and obligations;
    • The process for verifying consumer requests;
    • Training and recordkeeping requirements; and
    • Special requirements related to minors.

    The California attorney general will hold four public hearings between December 2 and December 5 on the proposed regulations. Written comments are due by December 6.

    Notably, the Notice of Proposed Rulemaking states that “the adoption of these regulations may have a significant, statewide adverse economic impact directly affecting business, including the ability of California businesses to compete with businesses in other states” and requests that the public consider, among other things, different compliance requirements depending on a business’s resources or potential exemptions from the regulatory requirements for businesses when submitting comments on the proposal.   

    Buckley will follow up with a more detailed summary of the proposed regulations soon.

    Privacy/Cyber Risk & Data Security State Issues State Attorney General CCPA State Legislation Agency Rule-Making & Guidance

  • Pre-checked box does not give consent to cookies under EU privacy directive and GDPR

    Privacy, Cyber Risk & Data Security

    On October 1, the European Court of Justice held that, under the Privacy and Electronic Communications Directive (ePrivacy Directive), a website user does not “consent” to the use of a cookie when a website provides a “pre-checked box” that must be deselected for the user to refuse consent. According to the judgment, a consumer group brought an action in German court against a German lottery company, challenging the website’s use of a pre-checked box allowing the website to place a cookie—a text file stored on the user’s computer that allows website providers to collect information about a user’s behavior when the user visits the website—unless the consumer deselected the box. The consumer group argued that the pre-selection of the box is not valid consent under the ePrivacy Directive. The lower court had upheld the action in part, but, following an appeal, the German Federal Court of Justice stayed the proceedings and referred the matter to the EU Court of Justice.

    The Court agreed with the consumer group, concluding that the practice violated the law by not requiring users to give active, express consent to the use of the cookies. Specifically, the Court noted that the 2009 amendments to Article 5(3) of the ePrivacy Directive, which requires the website user to give “his or her consent, having been provided with clear and comprehensive information,” must be interpreted such that “action is required on the part of the user in order to give his or her consent.” Because the box allowing the use of cookies was checked by default, “[i]t is not inconceivable that a user would not have read the information accompanying the preselected checkbox, or even would not have noticed that checkbox, before continuing with his or her activity on the website visited,” and therefore, it would “appear impossible” to determine whether a user gave consent to the cookies by not “deselecting a pre-ticked checkbox nor, in any event, whether that consent had been informed.” The Court noted that “[a]ctive consent is thus now expressly laid down in [the EU General Data Protection Regulation (GDPR)],” which “expressly precludes ‘silence, pre-ticked boxes or inactivity’” from constituting consent. Moreover, the Court held that the ePrivacy Directive also requires that, among other information, “the service provider must [disclose] to a website user . . . the duration of the operations of cookies and whether or not third parties may have access to those cookies” in order to give effect to the requirement of “clear and comprehensive information.”

    Privacy/Cyber Risk & Data Security European Union Consent Of Interest to Non-US Persons

  • Ballot initiative seeks to expand CCPA, create new enforcement agency

    Privacy, Cyber Risk & Data Security

    On September 25, Alastair Mactaggart, the Founder and Chair of the Californians for Consumer Privacy and the drafter of the initiative that ultimately resulted in the California Consumer Privacy Act (CCPA), announced a newly filed ballot measure to further expand the CCPA (currently effective on January 1, 2020), titled the “California Privacy Rights and Enforcement Act of 2020” (the Act) (an additional version of the Act is available with comments from Mactaggart’s team). The Act would result in significant amendments to the CCPA, including the following, among others:

    • Sensitive personal information. The Act sets forth additional obligations in connection with a business’s collection, use, sale, or disclosure of “sensitive personal information,” which is a new term introduced by the Act. “Sensitive personal information” includes categories such as health information; financial information (stated as, “a consumer’s account log-in, financial account, debit card, or credit card number in combination with any required security or access code, password, or credentials allowing access to an account”); racial or ethnic origin; precise geolocation; or other data collected and analyzed for the purpose of identifying such information.
    • Disclosure of sensitive personal information. The Act expands on the CCPA’s disclosure requirements to include, among other things, a requirement for businesses to specify the categories of sensitive personal information that will be collected, disclose the specific purposes for which the categories of sensitive personal information are collected or used, and disclose whether such information is sold. In addition, the Act prohibits a business from collecting additional categories of sensitive personal information or using sensitive personal information collected for purposes that are incompatible with the disclosed purpose for which the information was collected, or other disclosed purposes reasonably related to the original purpose for which the information was collected, unless notice is provided to the consumer.
    • Contractual requirements. The Act sets forth additional contractual requirements and obligations that apply when a business sells personal information to a third party or discloses personal information to a service provider or contractor for a business purpose. Among other things, the Act obligates the third party, service provider, or contractor to provide at least the same level of privacy protection required by the Act. The contract must also require the third party, service provider, or contractor to notify the business if it makes a determination that it can no longer meet its obligation to protect the personal information as required by the Act.
    • Eligibility for financial or lending services. The Act would require a business that collects personal information to disclose whether the business is profiling consumers and using their personal information for purposes of determining eligibility for, among other things, financial or lending services, housing, and insurance, as well as “meaningful information about the logic involved in using consumers’ personal information for this purpose.” Additionally, the business appears to be required to state in its privacy policy notice whether such profiling had, or could reasonably have been expected to have, a significant, adverse effect on consumers with respect to financial lending and loans, insurance, or any other enumerated categories. Notably, while Mactaggart has expressed heightened concern with sensitive personal information, such as health and financial information, the Act appears to retain the CCPA’s current exemptions under the Fair Credit Reporting Act and the Gramm-Leach-Bliley Act.
    • Advertising and marketing opt-out. The Act includes a consumer’s right to opt-out, at any time, of the business’s use of their sensitive personal information for advertising and marketing or disclosure of personal information to a service provider or contractor for the same purposes. The Act requires that businesses provide notice to consumers that their sensitive personal information may be used or disclosed for advertising or marketing purposes and that the consumers have “the right to opt-out” of its use or disclosure. “Advertising and marketing” means a communication by a business or a person acting on the business’s behalf in any medium intended to induce a consumer to buy, rent, lease, join, use, subscribe to, apply for, provide, or exchange products, goods, property, information, services, or employment.
    • Affirmative consent for sale of sensitive personal information. The Act expands on the CCPA’s opt-out provisions and prohibits businesses from selling a consumer’s sensitive personal information without actual affirmative authorization.
    • Right to correct inaccurate information. The Act provides consumers with the right to require a business to correct inaccurate personal information.
    • Definition of business.  The Act revises the definition of “business” to:
      • Clarify that the time period for calculating annual gross revenues is based on the prior calendar year; 
      • Provide that an entity meets the definition of “business” if the entity, in relevant part, alone or in combination, annually buys the personal information of 100,000 or more consumers or households;
      • Include a joint venture or partnership composed of businesses in which each business has at least a 40% interest; and
      • Provide a catch-all for businesses not covered by the foregoing bullets.
    • The “California Privacy Protection Agency.” The Act creates the California Privacy Protection Agency, which would have the power, authority, and jurisdiction to implement and enforce the CCPA (powers that are currently vested in the attorney general). The Act states that the Agency would have five members, including a single Chair, and the members would be appointed by the governor, the attorney general, and the leaders of the senate and assembly.

    If passed, the Act would become operative on January 1, 2021 and would apply to personal information collected by a business on or after January 1, 2020.

    As previously covered by a Buckley Special Alert, on September 13, lawmakers in California passed numerous amendments to the CCPA, which are awaiting the signature of Governor Gavin Newsom, who has until October 13 to sign. The amendments leave the majority of consumer rights intact, but certain provisions — including the definition of “personal information” — were clarified, as were certain exemptions regarding the collection of data that has a bearing on financial services companies.

    Privacy/Cyber Risk & Data Security State Issues State Legislation State Attorney General CCPA

  • NIST requests comments on draft privacy framework

    Privacy, Cyber Risk & Data Security

    On September 6, the National Institute of Standards and Technology (NIST) released a preliminary draft of the NIST Privacy Framework: A Tool for Improving Privacy through Enterprise Risk Management to help organizations assess and reduce risks. The draft framework is designed to align with NIST’s Cybersecurity Framework (previously covered by InfoBytes here), which provides guidance that critical infrastructures, including the financial services industry, should voluntarily follow to mitigate cybersecurity risk. The draft framework establishes three components to reinforce privacy risk management: (i) the “Core” describes a set of privacy activities and outcomes used to manage risks that arise from data processing or are associated with privacy breaches; (ii) “Profiles” cover an organization’s current privacy activities or desired outcomes that have been prioritized to manage privacy risk; and (iii) “Implementation Tiers” address how organizations see privacy risk, and whether they have sufficient processes and resources in place to manage that risk. According to NIST, “Finding ways to continue to derive benefits from data while simultaneously protecting individuals’ privacy is challenging, and not well-suited to one-size-fits-all solutions.” Public comments will be accepted through October 24.

    Privacy/Cyber Risk & Data Security NIST

  • Democratic members ask FSOC to deem cloud providers as "systemically important"

    Privacy, Cyber Risk & Data Security

    On August 22, two members of the U.S. House of Representatives, Katie Porter (D-Calif.) and Nydia Velázquez (D-N.Y.), sent a letter to the U.S. Department of the Treasury requesting that the Financial Stability Oversight Council (FSOC) consider designating the three leading providers of cloud-based storage systems for the financial industry as systemically important financial market utilities. The letter is in response to the recent data breach announcement by a national bank (covered by InfoBytes here), where an alleged former employee of the bank’s cloud-based storage provider gained unauthorized access to the personal information of credit card customers and people who had applied for credit card products. According to the Congresswomen, 57 percent of the cloud services market is “cornered by” three main providers, and “a lack of substitutability for the services provided by these very few firms creates systemic risk.” The letter argues that cloud services are not currently subject to an enforced regulatory regime and, “[w]ithout a dedicated regulatory regime proportional and tailored to their very unique structure and risks, cloud computing companies will continue to evade supervision.”

    Privacy/Cyber Risk & Data Security Data Breach Credit Cards FSOC Congress
