
InfoBytes Blog

Financial Services Law Insights and Observations


  • EU Commission, Council, and Parliament agree on details of AI Act

    Privacy, Cyber Risk & Data Security

    On December 9, the EU Commission announced a political agreement between the European Parliament and the Council on the proposed Artificial Intelligence Act (AI Act). The agreement is provisional and subject to finalization of the text and formal approval by lawmakers in the European Parliament and the Council. The AI Act will regulate the development and use of AI systems and impose fines for non-compliant use. The object of the law is to ensure that AI technology is safe and that its use respects fundamental democratic rights, while balancing the need to allow businesses to grow and thrive. The AI Act will also create a new European AI Office to ensure coordination and transparency and to “supervise the implementation and enforcement of the new rules.” According to an EU Parliament press release, powerful foundation models that pose systemic risks will be subject to specific rules in the final version of the AI Act based on a tiered classification.

    Apart from foundation models, the EU AI Act adopts a risk-based approach to the regulation of AI systems, classifying them into three risk categories: minimal risk, high risk, and unacceptable risk. Most AI systems would be deemed minimal risk since they pose little to no risk to citizens’ safety. High-risk AI systems would be subject to the heaviest obligations, including certifications on the adoption of risk-mitigation systems, data governance, logging of activity, documentation obligations, transparency requirements, human oversight, and cybersecurity standards. Examples of high-risk AI systems include utility infrastructures, medical devices, institutional admissions, law enforcement, biometric identification and categorization, and emotion recognition systems. AI systems deemed “unacceptable” are those that “present a clear threat to the fundamental rights of people,” such as systems that manipulate human behavior, like “deep fakes,” and any type of social scoring by governments or companies. While some biometric identification is allowed, “unacceptable” uses include emotion recognition systems in the workplace or by law enforcement agencies (with narrow exceptions).

    Sanctions for breach of the law will range from €7.5 million or 1.5 percent of a company’s total global revenue at the low end to €35 million or 7 percent of global revenue at the high end. Once adopted, the law will take effect in early 2026 at the earliest. Compliance will be challenging (the law applies to AI systems made available in the EU), and companies should assess whether their use and/or development of such systems will be affected.
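
    For a rough sense of the exposure these tiers imply, the sketch below turns the quoted figures into a worked calculation. It assumes the applicable cap is the higher of the fixed amount or the revenue percentage; that assumption, the tier names, and the function names are illustrative only and are not taken from the text of the AI Act.

    ```typescript
    // Illustrative sketch only: tier names and the "higher of the two" rule are
    // assumptions based on the ranges quoted above, not the final statutory text.
    type Tier = "low" | "top";

    const AI_ACT_TIERS: Record<Tier, { fixedEur: number; pctOfRevenue: number }> = {
      low: { fixedEur: 7_500_000, pctOfRevenue: 0.015 },
      top: { fixedEur: 35_000_000, pctOfRevenue: 0.07 },
    };

    function maxAiActFine(worldwideRevenueEur: number, tier: Tier): number {
      const { fixedEur, pctOfRevenue } = AI_ACT_TIERS[tier];
      // Assumed: the cap is whichever is higher, the fixed amount or the revenue share.
      return Math.max(fixedEur, pctOfRevenue * worldwideRevenueEur);
    }

    // Example: a company with €2bn in global revenue faces a top-tier cap of €140m.
    console.log(maxAiActFine(2_000_000_000, "top")); // 140000000
    ```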

    Privacy, Cyber Risk & Data Security Privacy European Union Artificial Intelligence Privacy/Cyber Risk & Data Security Of Interest to Non-US Persons

  • EU court clarifies conditions for imposing GDPR fines

    Courts

    On December 5, the Court of Justice of the European Union (CJEU) issued a judgment clarifying the conditions under which a General Data Protection Regulation (GDPR) fine can be imposed on data controllers. The judgment responds to two cases involving GDPR fines: (i) a German case in which a real estate company was fined for allegedly storing tenants’ personal data for longer than necessary, and (ii) a Lithuanian case in which a government health center was fined in connection with the creation of an app that registered and tracked people exposed to Covid-19.

    In the judgment, the CJEU clarified that a data controller can only face an administrative fine under the GDPR for intentional or negligent violations—that is, violations where the controller was aware, or should have been aware, of “the infringing nature of its conduct,” regardless of its knowledge of the specific violation. The judgment also held that, for a legal person, it is not necessary for the violation to be committed by its “management body,” nor does that body need to have knowledge of the specific violation. Instead, the legal person is accountable for violations committed by its representatives, directors, or managers, and by those acting on its behalf within the scope of its business. Additionally, imposing an administrative fine on a legal entity as a data controller does not require prior identification of a specific person responsible for the violation.

    The judgment also addressed administrative fines for processing operations involving multiple entities. The CJEU noted that a controller may have a fine imposed on it for actions undertaken by its processor. The court also clarified that a joint-controller relationship arises when two or more entities participate in determining the purposes and means of processing and “does not require that there be a formal arrangement between the entities in question.”

    When calculating the amount of an administrative fine under the GDPR, the supervisory authority must rely on the concept of an “undertaking” under competition law: the maximum fine is based on a percentage of the undertaking’s total worldwide annual turnover in the preceding business year.
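
    To make the cap concrete, the sketch below applies the higher-tier GDPR maximum under Article 83(5) (€20 million or 4 percent of total worldwide annual turnover, whichever is higher) to the turnover of the whole undertaking rather than a single subsidiary. The function name and the example figure are illustrative only, not a compliance calculation.

    ```typescript
    // Higher-tier GDPR cap (Art. 83(5)): €20m or 4% of total worldwide annual
    // turnover of the preceding year, whichever is higher. Illustrative sketch only.
    function maxGdprFine(undertakingTurnoverEur: number): number {
      return Math.max(20_000_000, 0.04 * undertakingTurnoverEur);
    }

    // Per the judgment, the relevant turnover is that of the whole "undertaking"
    // (the wider economic unit under competition law), not just the fined entity.
    console.log(maxGdprFine(5_000_000_000)); // 200000000, i.e. 4% of €5bn
    ```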

    Courts European Union GDPR Enforcement

  • FTC continues to enforce Privacy Shield

    Privacy, Cyber Risk & Data Security

    On August 5, the FTC Commissioners testified before the Senate Committee on Commerce, Science, and Transportation and discussed, among other things, the agency’s continued enforcement of the EU-U.S. Privacy Shield, despite the recent Court of Justice of the European Union (CJEU) decision invalidating the framework, and their interest in federal data privacy legislation. As previously covered by InfoBytes, in July the CJEU determined that because the requirements of U.S. national security, public interest, and law enforcement have “primacy” over the data protection principles of the EU-U.S. Privacy Shield, data transferred under the framework would not be subject to the same level of protection prescribed by the EU General Data Protection Regulation, and it therefore declared the Privacy Shield invalid.

    In his opening remarks, Commissioner Simons emphasized that the FTC will “continue to hold companies accountable for their privacy commitments, including privacy promises made under the Privacy Shield,” which the FTC has also noted on its website. Additionally, Simons urged Congress to enact federal privacy and data security legislation that would be enforced by the FTC and give the agency, among other things, the “ability to seek civil penalties” and “targeted [Administrative Procedures Act] rulemaking authority to ensure that the law keeps pace with changes in technology and the market.” Moreover, Commissioner Wilson agreed with a senator’s proposition that the enactment of a preemptive federal privacy framework would make “achieving a future adequacy determination by the E.U. easier.”

    Privacy/Cyber Risk & Data Security FTC Courts GDPR European Union EU-US Privacy Shield

  • EU-U.S. forum studies implications of Covid-19 for financial stability

    Federal Issues

    On July 17, the U.S. Treasury Department issued a joint statement on the EU-U.S. Financial Regulatory Forum, which met virtually on July 14 and 15 and included participants from Treasury, the Federal Reserve Board, CFTC, FDIC, SEC, and OCC. Forum participants discussed six key themes: (i) potential financial stability implications of, and economic responses to, the Covid-19 pandemic; (ii) capital market supervisory and regulatory cooperation, including cross-border supervision; (iii) “multilateral and bilateral engagement in banking and insurance,” including “cross-border resolution of systemic banks” and Volcker Rule implementation; (iv) approaches to anti-money laundering/countering the financing of terrorism (AML/CFT) and remittances; (v) the regulation and supervision of digital finance and financial innovation, such as “digital operational resilience and developments in crypto-assets, so-called stablecoins, and central bank digital currencies”; and (vi) sustainable finance developments. EU and U.S. participants recognized the importance of communicating mutual supervisory and regulatory concerns to “support financial stability, investor protection, market integrity, and a level playing field.”

    Federal Issues Regulation Of Interest to Non-US Persons Department of Treasury Federal Reserve CFTC FDIC SEC OCC Covid-19 European Union

  • Court of Justice of the European Union invalidates EU-U.S. Privacy Shield; standard contractual clauses survive (for now)

    Privacy, Cyber Risk & Data Security

    On July 16, 2020, the Court of Justice of the European Union (CJEU) issued its opinion in the Schrems II case (Case C-311/18). In its opinion, the CJEU concluded that the Standard Contractual Clauses issued by the European Commission for the transfer of personal data to data processors established outside of the EU are valid. However, the Court invalidated the EU-U.S. Privacy Shield. The ruling cannot be appealed.

    Background

    In 2015, a privacy campaigner named Max Schrems filed a complaint with Ireland’s Data Protection Commissioner challenging a global social media company’s transfers of data from servers in Ireland to servers in the U.S. Schrems argued that U.S. laws did not offer sufficient protection of EU customer data, that EU customer data might be at risk of being accessed and processed by the U.S. government once transferred, and that there was no remedy available to EU individuals to ensure protection of their personal data after transfer to the U.S. Schrems sought the suspension or prohibition of future data transfers, which the company executed through standard data protection contractual clauses (a method approved by the European Commission in 2010 in Decision 2010/87). The social media company had utilized these standard contractual clauses after the CJEU invalidated the U.S.-EU Safe Harbor Framework in 2015.

    Following the complaint, Ireland’s Data Protection Commissioner brought proceedings against the social media company in the Irish High Court, which referred numerous questions to the CJEU for a preliminary ruling, including questions addressing the validity of the standard contractual clauses and the EU-U.S. Privacy Shield.

    CJEU Opinion – Standard Contractual Clauses (Decision 2010/87)

    After reviewing the recommendations of its Advocate General, published on December 19, 2019, the CJEU found the Decision approving the use of standard contractual clauses to transfer personal data to be valid.

    The CJEU noted that the GDPR applies to the transfer of personal data for commercial purposes by a company operating in an EU member state to another company outside of the EU, notwithstanding the third-party country’s processing of the data under its own security laws. Moreover, the CJEU explained that data protection contractual clauses between an EU company and a company operating in a third-party country must afford a level of protection “essentially equivalent to that which is guaranteed within the European Union” under the GDPR. According to the CJEU, the level of protection must take into consideration not only the contractual clauses executed by the companies, but the “relevant aspects of the legal system of that third country.”

    As for Decision 2010/87, the CJEU determined that it provides effective mechanisms to ensure, in practice, that contractual clauses governing data transfers comply with the level of protection required by the GDPR, and that it appropriately requires the suspension or prohibition of transfers in the event the clauses are breached or cannot be honored. The CJEU specifically highlighted the certification required of the EU data exporter and the third-party country recipient to verify, prior to any transfer, (i) the level of data protection in the third-party country; and (ii) their ability to comply with the data protection clauses.

    CJEU Opinion – EU-U.S. Privacy Shield (Decision 2016/1250)

    The CJEU decided to examine and rule on the validity of the EU-U.S. Privacy Shield. The CJEU determined that because the requirements of U.S. national security, public interest and law enforcement have “primacy” over the data protection principles of the EU-U.S. Privacy Shield, data transferred under the framework would not be subject to the same level of protections prescribed by the GDPR. Specifically, the CJEU held that the surveillance programs used by U.S. authorities are not proportionally equivalent to those allowed under EU law because they are not “limited to what is strictly necessary,” nor, under certain surveillance programs, does the U.S. “grant data subjects actionable rights before the courts against the U.S. authorities.” Moreover, the CJEU rejected the argument that the Ombudsperson mechanism satisfies the GDPR’s right to judicial protection, stating that it “does not provide any cause of action before a body which offers the persons whose data is transferred to the United States guarantees essentially equivalent to those required by [the GDPR],” and that the Ombudsperson “cannot be regarded as a tribunal.” On those grounds, the CJEU declared the EU-U.S. Privacy Shield invalid.

    Privacy/Cyber Risk & Data Security GDPR European Union Of Interest to Non-US Persons EU-US Privacy Shield

  • U.S., EU discuss financial regulatory developments

    Federal Issues

    On February 19, the U.S. Treasury Department issued a joint statement on the U.S.-EU Financial Regulatory Forum held February 11-12 in Washington, D.C. U.S. participants included officials from the Federal Reserve Board, CFTC, FDIC, SEC, OCC, and Treasury. Forum topics focused on five key themes: “(1) supervision and regulation of cross-border activities, particularly in the areas of derivatives and central clearing; (2) the importance of monitoring market developments, both in relation to financial asset classes, like leveraged loans and collateralized loan obligations, and reference rates, like the London Interbank Offered Rate; (3) implementation of international standards in banking and insurance; (4) regulatory issues presented by fintech/digital finance; and (5) EU regulations related to sustainable finance.”

    Among other topics, participants discussed U.S. banking developments concerning prudential requirements for foreign banks, including tailoring standards based on risk; proposed amendments to the Volcker Rule; EU data protection rules; cross-border supervision and data flow in financial services; the transition period following the U.K.’s departure from the EU; and European Commission priorities such as preventing and combating money laundering and the financing of terrorism. Participants acknowledged the importance of fostering continued dialogue between the U.S. and the EU, noting that “[r]egular communication on supervisory and regulatory issues of mutual concern should foster financial stability, supervisory cooperation, investor protection, market integrity, and a level playing field.”

    Federal Issues Department of Treasury Federal Reserve CFTC FDIC SEC OCC European Union Of Interest to Non-US Persons LIBOR Fintech Anti-Money Laundering Combating the Financing of Terrorism

  • EU Court of Justice: Orders to remove defamatory content issued by member state courts can be applied worldwide

    Courts

    On October 3, the European Court of Justice held that a social media company can be ordered to remove, worldwide, defamatory content previously declared to be unlawful “irrespective of who required the storage of that information.” The decision results from a 2016 challenge brought by a former Austrian politician against the social media company’s Ireland-based operation—responsible for users located outside of the U.S. and Canada—to remove defamatory posts and comments made about her on a user’s personal page that was accessible to any user. The social media company disabled access to the content after an Austrian court issued an interim order, which found the posts to be “harmful to her reputation,” and ordered the social media company to cease and desist “publishing and/or disseminating photographs” showing the former politician “if the accompanying text contained the assertions, verbatim and/or [used] words having an equivalent meaning as that of the comment” originally at issue. On appeal, the higher regional court upheld the order but determined that “the dissemination of allegations of equivalent content had to cease only as regards [to] those brought to the knowledge of the [social media company] by the [former politician] in the main proceedings, by third parties or otherwise.”

    The Austrian Supreme Court of Justice requested that the EU Court of Justice adjudicate whether the cease and desist order may also be “extended to statements with identical wording and/or having equivalent content of which it is not aware” under Article 15(1) of Directive 2000/31 (commonly known as the “directive on electronic commerce”). Specifically, the EU Court of Justice considered (i) whether Directive 2000/31 generally precludes a host provider that has not “expeditiously removed illegal information”—including identically worded items of information—from being ordered to remove content worldwide; (ii) if Directive 2000/31 does not preclude the host provider from its obligations, “does this also apply in each case for information with an equivalent meaning”; and (iii) whether Directive 2000/31 also applies to “information with an equivalent meaning as soon as the operator has become aware of this circumstance.”

    According to the judgment, Directive 2000/31 “does not preclude those injunction measures from producing effects worldwide,” holding that a national court within the member states may order host providers to remove posts it finds defamatory or illegal. However, the judges concluded that such an order must function “within the framework of the relevant international law.”

    Courts European Union Privacy/Cyber Risk & Data Security

  • Pre-checked box does not give consent to cookies under EU privacy directive and GDPR

    Privacy, Cyber Risk & Data Security

    On October 1, the European Court of Justice held that, under the Privacy and Electronic Communications Directive (ePrivacy Directive), a website user does not “consent” to the use of a cookie when a website provides a “pre-checked box” that the user must deselect to refuse consent. According to the judgment, a consumer group brought an action in German court against a German lottery company, challenging the website’s use of a pre-checked box allowing the website to place a cookie—a text file stored on the user’s computer that allows the website provider to collect information about the user’s behavior when the user visits the website—unless the consumer deselected the box. The consumer group argued that the pre-selection of the box is not valid consent under the ePrivacy Directive. The lower court had upheld the action in part, but, following an appeal, the German Federal Court of Justice stayed the proceedings and referred the matter to the EU Court of Justice.

    The Court agreed with the consumer group, concluding that the practice violated the law by not requiring users to give active, express consent to the use of the cookies. Specifically, the Court noted that the 2009 amendments to Article 5(3) of the ePrivacy Directive, which require the website user to have “given his or her consent, having been provided with clear and comprehensive information,” must be interpreted literally, such that action is required on the part of the user in order to give his or her consent. Because the box allowing the use of cookies was checked by default, “[i]t is not inconceivable that a user would not have read the information accompanying the preselected checkbox, or even would not have noticed that checkbox, before continuing with his or her activity on the website visited,” and therefore, it would “appear impossible” to determine whether a user gave consent to the cookies by not “deselecting a pre-ticked checkbox nor, in any event, whether that consent had been informed.” The Court noted that “[a]ctive consent is thus now expressly laid down in [the EU General Data Protection Regulation (GDPR)],” which “expressly precludes ‘silence, pre-ticked boxes or inactivity’ from constituting consent.” Moreover, the Court held that the ePrivacy Directive also requires that, among other information, “the service provider must [disclose] to a website user . . . the duration of the operations of cookies and whether or not third parties may have access to those cookies” to give effect to “clear and comprehensive information.”
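
    For website operators, the practical consequence is that consent flags must default to off and flip only on an explicit user action. The sketch below is a minimal illustration of that pattern; the names (ConsentRecord, recordOptIn) are hypothetical and are not drawn from the ruling or any particular consent library.

    ```typescript
    // Minimal sketch of active, opt-in cookie consent consistent with the ruling
    // described above. All names here are illustrative, not a real consent API.
    interface ConsentRecord {
      analyticsCookies: boolean; // must default to false: no pre-ticked boxes
      grantedAt?: Date;          // set only when the user actively opts in
    }

    // Initial state shown to the user: nothing is pre-selected.
    function initialConsent(): ConsentRecord {
      return { analyticsCookies: false }; // silence or inactivity is not consent
    }

    // Called only from an explicit user action, e.g. clicking an "Accept" button.
    function recordOptIn(consent: ConsentRecord): ConsentRecord {
      return { ...consent, analyticsCookies: true, grantedAt: new Date() };
    }

    // Cookies covered by the consent are placed only after an explicit opt-in.
    function mayPlaceAnalyticsCookie(consent: ConsentRecord): boolean {
      return consent.analyticsCookies && consent.grantedAt !== undefined;
    }
    ```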

    Privacy/Cyber Risk & Data Security European Union Consent Of Interest to Non-US Persons

  • EU's “right to be forgotten” law applies only in EU

    Courts

    On September 24, the European Court of Justice held that Europe’s “right to be forgotten” online privacy law—which allows individuals to request the deletion of personal information from online sources that the individual believes infringes on their right to privacy—can be applied only in the European Union. The decision results from a challenge by a global search engine to a 2015 order by a French regulator, the Commission Nationale de l'Informatique et des Libertés (CNIL), requiring the search engine to delist certain links from all of its global domains, not just domains originating from the European Union. The search engine refused to comply with the order, and the CNIL imposed a €100,000 penalty. The search engine sought annulment of the order and penalty, arguing that the “right to be forgotten” does not “necessarily require that the links at issue are to be removed, without geographical limitation, from all its search engine’s domain names.” Moreover, the search engine asserted that the CNIL “disregarded the principles of courtesy and non-interference recognised by public international law” and infringed on the freedoms of expression, information, and communication.

    The Court of Justice agreed with the search engine. Specifically, the Court noted that while the “internet is a global network without borders” and internet users’ access outside of the EU to a referencing link to privacy infringing personal information is “likely to have immediate and substantial effects on that person within the Union itself,” there is no obligation under current EU law for a search engine to carry out the requested deletion on all global versions of its network. The Court explained that numerous nations do not recognize “the right to be forgotten” or take an alternate approach to the right. Additionally, the Court emphasized that “the right to the protection of personal data is not an absolute right, but must be considered in relation to its function in society and be balanced against other fundamental rights, in accordance with the principle of proportionality.” The Court concluded that, while the EU struck that balance within its union, “it has not, to date, struck such a balance as regards the scope of a de-referencing outside of the union.”

    Courts Privacy/Cyber Risk & Data Security European Union Of Interest to Non-US Persons

  • U.S. Treasury concerned with European Commission's identification of AML/CFT-deficient U.S. territories

    Financial Crimes

    On February 13, the U.S. Treasury Department issued a statement responding to a list of jurisdictions published by the European Commission as having strategic deficiencies related to anti-money laundering and countering the financing of terrorism (AML/CFT). The list—which includes certain jurisdictions with strategic deficiencies that were already identified by the Financial Action Task Force (FATF) (previously covered by InfoBytes)—also identifies 11 additional jurisdictions, including the U.S. territories of American Samoa, Guam, Puerto Rico, and the U.S. Virgin Islands. According to the European Commission, “banks and other entities covered by EU anti-money laundering rules will be required to apply increased checks (due diligence) on financial operations involving customers and financial institutions from these high-risk third countries to better identify any suspicious money flows.”

    Financial Crimes Department of Treasury European Union Of Interest to Non-US Persons Anti-Money Laundering Combating the Financing of Terrorism FATF
