InfoBytes Blog

Financial Services Law Insights and Observations


  • District Court: Unclear when networking site became aware of data scraping

    Privacy, Cyber Risk & Data Security

    On November 3, the U.S. District Court for the Northern District of California issued an order ruling on cross-motions for summary judgment in an action concerning whether a now-defunct plaintiff data analytics company breached a user agreement with a defendant professional networking site by using an automated process to extract user data (a process known as “scraping”) for the purposes of selling its analytics services to businesses. The defendant claimed that the user agreement prohibits scraping, and sent the plaintiff a cease-and-desist letter demanding it stop and alleging violations of the Computer Fraud and Abuse Act (CFAA) as well as various state laws. In response, the plaintiff sued the defendant, arguing that it had a right to access the public pages, and later sought a preliminary injunction, which the district court granted.
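
    For readers less familiar with the underlying mechanics, scraping simply means programmatically fetching pages and parsing out the fields of interest. The following Python sketch is purely illustrative; the URL pattern and HTML selectors are hypothetical, not the networking site’s actual markup:

        # Minimal sketch of bulk scraping of public profile pages.
        # The URL pattern and selectors are hypothetical illustrations.
        import requests
        from bs4 import BeautifulSoup

        def scrape_public_profile(profile_url: str) -> dict:
            """Fetch one public profile page and extract basic fields."""
            response = requests.get(profile_url, timeout=10)
            response.raise_for_status()
            soup = BeautifulSoup(response.text, "html.parser")
            return {
                "name": soup.select_one("h1.profile-name").get_text(strip=True),
                "headline": soup.select_one("p.headline").get_text(strip=True),
            }

        # Automated bots typically iterate over many profile URLs in bulk.
        profiles = [scrape_public_profile(u) for u in ["https://example.com/in/jdoe"]]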

    As previously covered by InfoBytes, earlier this year, the U.S. Court of Appeals for the Ninth Circuit, on remand from the U.S. Supreme Court, affirmed the district court’s order preliminarily enjoining the defendant from denying the plaintiff access to publicly available member profiles. The 9th Circuit had previously affirmed the preliminary injunction, but was called to further consider whether the CFAA applies to the plaintiff’s data scraping after the U.S. Supreme Court vacated the appellate court’s judgment in light of its ruling in Van Buren v. United States. The 9th Circuit found that the ruling in Van Buren, in which the Supreme Court suggested the CFAA only applies in cases where someone is accused of hacking into or exceeding their authorized access to a network that is protected, or in situations where the “gates are up,” narrowed the CFAA’s scope and most likely did not apply to cases involving data scraped in bulk by automated bots from public websites. The appellate court concluded, among other things, that the plaintiff showed that it “currently has no viable way to remain in business other than using [the networking site’s] public profile data” for its analytics services and “demonstrated a likelihood of irreparable harm absent a preliminary injunction.” Moreover, the 9th Circuit rejected the defendant’s claims that the plaintiff violated the CFAA.

    In partially granting the defendant’s motion and denying the plaintiff’s, the district court ruled that the plaintiff breached its user agreement by directing the creation of fake accounts and copying of URL data as part of its scraping process. Nonetheless, the district court noted there remains a legitimate dispute over whether the defendant waived its right to enforce the user agreement after the plaintiff openly discussed its business model, including its reliance on scraping, at conferences it organized that were attended by the defendant’s executives. Moreover, questions remain for trial as to when the defendant became aware of the plaintiff’s scraping, whether it should have taken “steps to legally enforce against known scraping” sooner, and whether the defendant can raise certain defenses to its breach of contract claim tied to the plaintiff’s data scraping and unauthorized use of data.

    Privacy, Cyber Risk & Data Security Courts Data Scraping Consumer Protection Computer Fraud and Abuse Act State Issues California Appellate Ninth Circuit

  • CPPA says comments on modified draft privacy rules due November 21

    Privacy, Cyber Risk & Data Security

    On November 3, the California Privacy Protection Agency (CPPA) Board officially posted updated draft rules for implementing the California Privacy Rights Act of 2020, which amends and builds on the California Consumer Privacy Act of 2018. The draft rules were previously released in advance of a CPPA Board meeting held at the end of October (see previous InfoBytes coverage here for a detailed breakdown of the proposed changes). A few notable changes between the versions include:

    • A requirement that a business must treat an opt-out preference signal as a valid request to opt out of sale/sharing for not only that browser or device but also for “any consumer profile associated with that browser or device, including pseudonymous profiles.”
    • A requirement that if a business does not ask a consumer to affirm their intent with regard to a financial incentive program, “the business shall still process the opt-out preference signal as a valid request to opt-out of sale/sharing for that browser or device and any consumer profile the business associates with that browser or device.” However, if a consumer submits an opt-out of sale/sharing request but does not affirm their intent to withdraw from a financial incentive program, the business may ignore the opt-out preference signal with respect to the consumer’s participation in the financial incentive program. (A minimal technical sketch of signal processing appears after this list.)
    • The addition of the following provision: “As part of the Agency’s decision to pursue investigations of possible or alleged violations of the CCPA, the Agency may consider all facts it determines to be relevant, including the amount of time between the effective date of the statutory or regulatory requirement(s) and the possible or alleged violation(s) of those requirements, and good faith efforts to comply with those requirements.”
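
    For context on what processing an opt-out preference signal involves, one widely deployed signal is the Global Privacy Control HTTP header (Sec-GPC). The following is a minimal sketch assuming a Flask application; the opt_out_sale_sharing helper is hypothetical:

        # Minimal sketch of honoring an opt-out preference signal
        # (the Global Privacy Control header, Sec-GPC) in Flask.
        # opt_out_sale_sharing() is a hypothetical helper.
        from flask import Flask, request

        app = Flask(__name__)

        def opt_out_sale_sharing(device_id: str) -> None:
            """Hypothetical: record an opt-out of sale/sharing for the
            device and any consumer profile associated with it,
            including pseudonymous profiles, per the draft rules."""
            ...

        @app.before_request
        def honor_opt_out_signal():
            # A Sec-GPC value of "1" indicates a valid opt-out request.
            if request.headers.get("Sec-GPC") == "1":
                device_id = request.cookies.get("device_id", "unknown")
                opt_out_sale_sharing(device_id)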

    Comments on the amended draft rules are due November 21 by 8 am PT.

    Privacy, Cyber Risk & Data Security State Issues CPPA CCPA CPRA Agency Rule-Making & Guidance Consumer Protection

  • FTC’s annual PrivacyCon focuses on consumer privacy and security issues

    Privacy, Cyber Risk & Data Security

    On November 1, the FTC held its annual PrivacyCon event, which hosted research presentations on a wide range of consumer privacy and security issues. Opening the event, FTC Chair Lina Khan stressed the importance of hearing from the academic community on privacy issues that the FTC and other government bodies may otherwise miss. Khan emphasized that regulators cannot wait until new technologies fully emerge to think of ways to implement new laws for safeguarding consumers. “The FTC needs to be on top of this emerging industry now, before problematic business models have time to solidify,” Khan said, adding that the FTC is consistently working on privacy matters and is “prioritizing the use of creative ideas from academia in [its] bread-and-butter work” to craft better remedies that reflect what is actually happening. She highlighted a recent enforcement action taken against an online alcohol marketplace and its CEO for failing to take reasonable steps to prevent two major data breaches (covered by InfoBytes here). Khan noted that while the settlement’s requirements, such as imposing multi-factor authentication and destroying unneeded user data, may not sound “very cutting-edge,” they serve as a big step forward for government enforcers.

    Chief Technology Officer Stephanie Nguyen, who leads the effort to integrate technologists across the FTC’s various lines of work, including consumer privacy, discussed how these technologists (AI and security experts, software engineers, designers, and data scientists) help develop remedies in data security-related enforcement actions and push companies not just to do the minimum to remediate problems like unreasonable data security, but to model best practices for the industry. “We want to see bad actors face real consequences,” Nguyen said, adding that the FTC wants to hold corporate leadership accountable, as it did in the enforcement action Khan cited. Nguyen further stressed the need to address systemic risk by making companies delete illegally collected data and destroy any algorithms derived from that data.

    The one-day conference featured several panel sessions covering a number of topics related to consumer surveillance, automated decision-making systems, children’s privacy, devices that listen to users, augmented/virtual reality, interfaces and dark patterns, and advertising technology. Topics addressed during the panels included (i) requiring data brokers to provide accurate information; (ii) understanding how data inaccuracies can disproportionately affect minorities and those living in poverty, and why relying on this data can lead to discriminatory practices; (iii) examining bias and discrimination risks when engaging in emotional artificial intelligence; (iv) understanding automated decision-making systems and how the quality of these systems impacts the populations they are meant to represent; (v) recognizing the lack of transparency related to children’s data collection and use, and the impact various privacy laws, including the Children’s Online Privacy Protection Rule, the General Data Protection Regulation, and the California Consumer Privacy Act, have on the collection/use/sharing of personal data; (vi) recognizing challenges related to cookie-consent interfaces and dark patterns; and (vii) examining how targeted online advertising both in the U.S. and abroad affects consumers.

    Privacy, Cyber Risk & Data Security FTC Consumer Protection Artificial Intelligence Dark Patterns Enforcement

  • FTC fines internet phone provider $100 million for dark patterns and junk fees

    Federal Issues

    On November 3, the FTC announced an action against an internet phone service provider, claiming the company imposed “junk fees” and made it difficult for consumers to cancel their services. The FTC alleged in its complaint that the company violated the FTC Act and the Restore Online Shoppers’ Confidence Act by imposing a series of obstacles, sometimes referred to as “dark patterns,” to deter and prevent consumers from canceling their services or stopping recurring charges. Consumers who signed up for services online were allegedly forced to speak to a live “retention agent” on the phone, during limited working hours, in order to cancel their services. The company also allegedly imposed a “panoply of hurdles” on consumers attempting to cancel by, among other things, making it difficult to locate the phone number on its website, obscuring contact information, failing to consistently transfer consumers to the appropriate number, imposing lengthy wait times, reducing operating hours for the cancellation line, and failing to provide promised callbacks. Additionally, the FTC claimed the company often informed consumers they would have to pay an early termination fee (sometimes hundreds of dollars) that was not clearly disclosed when they signed up for the services, and continued to illegally charge consumers without consent even after they requested cancellation. According to the FTC, consumers who complained often received only partial refunds.

    Under the terms of the proposed stipulated order, the company will be required to take several measures, including (i) obtaining consumers’ express, informed consent to charge them for services; (ii) simplifying the cancellation process to ensure it is easy to find and use and is available through the same method the consumer used to enroll; (iii) ending the use of dark patterns to impede consumers’ cancellation efforts; and (iv) being transparent about the terms of any negative option subscription plans, including providing required disclosures as well as a simple mechanism for consumers to cancel the feature. The company will also be required to pay $100 million in monetary relief.

    Federal Issues FTC Enforcement Junk Fees Dark Patterns Consumer Finance Consumer Protection FTC Act ROSCA

  • Republican senators oppose FTC’s ANPR on data privacy and security

    Federal Issues

    On November 3, three Republican Senators sent a letter to FTC Chair Lina Khan expressing their opposition to the FTC’s Advance Notice of Proposed Rulemaking (ANPR) for the Trade Regulation Rule on Commercial Surveillance and Data Security. As previously covered by InfoBytes, in August the FTC announced the ANPR covering a wide range of concerns about commercial surveillance practices, specifically related to the business of collecting, analyzing, and profiting from information about individuals. In the letter, the Senators argued that both consumers and businesses would benefit if Congress enacted comprehensive federal legislation addressing data privacy. According to the Senators, the FTC “lacks the authority to create preemptive standards” and the proposed rulemaking “would only add uncertainty and confusion to an already complicated regulatory landscape, increasing compliance costs, reducing competition, and ultimately harming consumers.” The Senators requested that the FTC withdraw its rulemaking proposal, explaining that “[c]onsumer data privacy and security are complex issues which will require standards that are robust, adaptive, and can balance the interests of consumers with the needs of businesses.” The Senators noted that they believe “that this balance can only be struck within federal legislation that is comprehensive and preemptive, such that the law creates a single national standard.”

    Federal Issues Privacy, Cyber Risk & Data Security Agency Rule-Making & Guidance FTC U.S. Senate Consumer Protection

  • FTC takes action against ed tech provider for lax data security

    Federal Issues

    On October 31, the FTC announced an administrative action against an education technology (ed tech) provider, claiming that the company’s poor data security practices exposed the sensitive information of millions of users and employees, including Social Security numbers, email addresses, and passwords. According to the FTC’s complaint, due to the company’s alleged failure to adequately protect the personal information collected from its users and employees, the company experienced four data breaches beginning in September 2017, when a phishing attack granted a hacker access to employees’ direct deposit information. Less than a year later, another data breach occurred when a former employee used login information the company shared with employees and outside contractors to gain access to a third-party cloud database containing personal data for roughly 40 million users. In the following two years, the company experienced two more data breaches through phishing attacks that exposed sensitive employee data, including medical and financial information. Claiming violations of Section 5(a) of the FTC Act, the Commission alleged the company failed to implement basic security measures, stored personal data insecurely, and failed to implement a written security policy until January 2021, despite experiencing three phishing attacks.

    Under the terms of the proposed decision and order, the company would be required to take several measures to address the alleged conduct, including (i) documenting and limiting data collection; (ii) providing users access to collected data and allowing them to submit requests for deletion; (iii) implementing multifactor authentication or another authentication method to protect user and employee accounts; and (iv) implementing a comprehensive information security program that would encrypt consumer data and provide security training to employees, among other things.
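
    As one illustration of item (iii) above, multifactor authentication is commonly implemented with time-based one-time passwords (TOTP). The following minimal sketch uses the pyotp library; the enrollment and secret-storage details are simplified assumptions:

        # Minimal sketch of TOTP-based multifactor authentication using
        # the pyotp library; how the shared secret is stored and delivered
        # to the user is simplified here.
        import pyotp

        # At enrollment: generate a per-user secret and store it server-side;
        # the user loads the same secret into an authenticator app.
        secret = pyotp.random_base32()
        totp = pyotp.TOTP(secret)

        # At login: the user types the 6-digit code their app displays,
        # and the server verifies it against the shared secret.
        code_from_user = totp.now()  # stand-in for the code the user types
        assert totp.verify(code_from_user, valid_window=1)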

    This action is part of the FTC’s ongoing efforts to make sure ed tech providers protect and secure personal data they collect and do not collect more information than necessary. As previously covered by InfoBytes, the FTC issued a policy statement in May warning ed tech providers that they must fully comply with all provisions of the Children’s Online Privacy Protection Act when gathering data about children. The FTC emphasized that ed tech providers may not harvest or monetize children’s data, cannot force children to disclose more information than is reasonably necessary for participating in their educational services, and must have procedures in place to keep the data secure, among other things.

    Federal Issues Privacy, Cyber Risk & Data Security FTC Enforcement FTC Act UDAP COPPA Data Breach Consumer Protection

  • EU Court of Justice says controllers of personal data must take reasonable steps to inform third parties when consumer consent is withdrawn

    Privacy, Cyber Risk & Data Security

    On October 27, the European Court of Justice (ECJ) held that controllers of personal data must take reasonable steps to inform other controllers when a data subject withdraws consent. The decision stems from a request made by a subscriber to a Belgian telecommunications provider not to have his information included in the public telephone directories and directory inquiry services published by the company and other third parties. The controller removed the subscriber’s information from the directories, but re-added it after receiving an update to the subscriber’s data that was not marked as confidential. The subscriber submitted multiple requests for his data to be removed and filed a complaint with the Belgian Data Protection Authority. The Data Protection Authority ordered the company to take remedial action and fined it €20,000 for infringing several provisions of the General Data Protection Regulation (GDPR). The controller appealed, “arguing that the consent of the subscriber is not required for the purposes of the publication of his or her personal data in the telephone directories, rather the subscribers must themselves request not to be included in those directories under an ‘opt-out’ system. In the absence of such a request, the subscriber concerned may in fact be included in those directories.” The Data Protection Authority contended, however, that the privacy and electronic communications directive “requires the ‘consent of subscribers’ within the meaning of the GDPR in order for the providers of directories to be able to process and pass on their personal data.”

    The Brussels Court of Appeal referred questions to the ECJ for a preliminary ruling after determining that there are no specific rules “concerning the withdrawal by a subscriber of his or her statement of wishes or of that ‘consent.’” The ECJ determined that controllers of personal data must get consumers’ informed consent before publishing their information in a public directory. Further, the ECJ determined that such consent can be extended to any subsequent processing of data by third parties, provided the data is processed for the same purpose to which the consumer consented. However, consumers can withdraw consent at any time, and controllers are required to make reasonable efforts to notify third parties, including search engine providers, that are making use of that subscriber’s information of the withdrawal. Notably, the ECJ concluded that if various controllers rely on the single consent of a data subject, “it is sufficient, in order for that person to withdraw such consent, that he or she contacts any one of the controllers.”
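
    In practical terms, the ruling suggests that a controller receiving a withdrawal must fan it out to the other controllers that relied on the same consent. The following minimal sketch illustrates one way that might look; the endpoints and helper are hypothetical:

        # Minimal sketch of propagating a consent withdrawal to
        # downstream controllers; all endpoints are hypothetical.
        import requests

        DOWNSTREAM_CONTROLLERS = [
            "https://directory-a.example/api/withdrawals",
            "https://search-engine-b.example/api/withdrawals",
        ]

        def propagate_withdrawal(subject_id: str) -> None:
            """Notify each downstream controller that relied on the data
            subject's single consent that it has been withdrawn."""
            for endpoint in DOWNSTREAM_CONTROLLERS:
                requests.post(
                    endpoint,
                    json={"subject_id": subject_id, "action": "consent_withdrawn"},
                    timeout=10,
                )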

    Privacy, Cyber Risk & Data Security Of Interest to Non-US Persons EU Courts GDPR Enforcement Consumer Protection

  • FTC to issue rulemaking on junk fees and fake reviews

    Federal Issues

    On October 20, the FTC voted 3-1 at an open meeting to publish two rulemaking notices for comment: the Advance Notice of Proposed Rulemaking (ANPRM) on Junk Fees (see here) and the ANPRM on Fake Reviews and Endorsements (see here). The first ANPRM addresses junk fees that are charged for goods or services that have little or no added value to the consumer, and seeks comments on the prevalence of junk fees and the consumer harms arising from junk fee practices, among other topics. The second ANPRM initiates a rulemaking proceeding addressing fake reviews and other endorsements, which can cheat consumers and honest businesses alike, and seeks comment on the prevalence of fake and deceptive reviews and the consumer harms arising from them, among other things.

    At the start of the meeting, members of the public provided feedback on the Commission’s work, with some expressing concerns about how junk fees harm consumers and businesses. Others relayed consumers’ frustration with hidden fees added to bills that were not advertised up front. Regarding fake advertisements, some emphasized how heavily consumers rely on reviews and how fake reviews can harm consumers and sellers alike. Commissioner Wilson, the sole “no” vote on both measures, noted that the ANPRM on junk fees “is sweeping in its breadth,” and said it potentially contradicts existing laws and rules, among other things. Chair Khan, Commissioner Slaughter, and Commissioner Bedoya all voted yes on both measures. Regarding the junk fees ANPRM, Commissioner Slaughter said she does not consider the issue “obscure” and expressed her support, emphasizing that markets cannot function effectively with junk fees. Commissioner Wilson noted that she agrees that “fake and deceptive reviews are unlawful,” but does not believe public comment should be sought for this proposal because “the Commission already has a multi-pronged strategy in place to combat this issue,” such as FTC-published endorsement guides. Additionally, in October 2021, the Commission issued a notice of penalty offenses, which is explained in the ANPRM and may enable the Commission to obtain civil penalties from marketers that use fake reviews.

    Federal Issues Agency Rule-Making & Guidance FTC Junk Fees Endorsements Consumer Protection UDAP

  • California’s privacy agency amends draft privacy rules ahead of meeting

    Privacy, Cyber Risk & Data Security

    In advance of an upcoming meeting of the California Privacy Protection Agency (CPPA) Board scheduled for October 28-29, the agency posted updated draft rules for implementing the California Privacy Rights Act (CPRA). As previously covered by InfoBytes, the CPRA (largely effective January 1, 2023, with enforcement delayed until July 1, 2023) was approved by ballot measure in November 2020 to amend and build on the California Consumer Privacy Act (CCPA). In July, the CPPA initiated formal rulemaking procedures to adopt proposed regulations implementing the CPRA (covered by InfoBytes here).

    The proposed changes to the draft rules respond to comments received during the 45-day comment period, in which several businesses expressed concerns that the requirements were confusing and complying would be costly. (See also Explanation of Modified Text of Proposed Regulations.) Key clarifying modifications include:

    • Adding, amending, and striking certain definitions. The proposed changes would, among other things, revise the definition of “disproportionate effort” to clarify that it applies to service providers, contractors, and third parties as well as to businesses. The revisions also provide additional details concerning factors that should be considered when evaluating whether responding to a consumer request would require disproportionate effort. The changes also add and amend terms such as “first party,” “information practices,” “nonbusiness,” “privacy policy,” and “unstructured.”
    • Outlining restrictions on how a consumer’s personal information is collected or used. The revisions propose criteria for how a business should evaluate the “reasonable expectation” of consumers concerning the collection or processing of their personal information, including how to determine the purpose for which the personal information is collected, whether it is reasonably necessary and proportionate for achieving the stated purposes, and whether it is a “business purpose” under the CCPA/CPRA. According to the CPPA’s explanation of the modified text, the “factors consider relevant GDPR principles for harmonization while articulating the statutory requirements and intent of the CCPA.”
    • Providing disclosure and communications requirements. The proposed changes clarify that conspicuous links for websites should appear in a similar manner as other similarly posted links, and provide guidance on the placement of conspicuous links in a mobile environment.
    • Clarifying requirements for obtaining consumer consent. The revisions explain how different user interfaces and “choice architecture” can impair or interfere with a consumer’s ability to make a choice, and thus fail to meet the definition of consent. The revisions further address provisions related to dark patterns, explaining that “[i]f a business did not intend to design the user interface to subvert or impair user choice, but the business knows of and does not remedy a user interface that has that effect, the user interface may still be a dark pattern. Similarly, a business’s deliberate ignorance of the effect of its user interface may also weigh in favor of establishing a dark pattern.”
    • Amending requirements related to a business’s privacy notice. The revisions eliminate the requirement that a business disclose, in its notice at collection, either the names or the business practices of third parties that the business allows to collect personal information from the consumer. Additionally, a business and third party may provide a single notice at collection that outlines the required information about their collective information practices.
    • Amending the right to limit the use/disclosure of sensitive personal information. The proposed changes clarify that a business does not need to provide a notice of right to limit the use of sensitive personal information if the business only collects or processes sensitive personal information without the purpose of inferring characteristics about a consumer. Additionally, the revisions would make it optional for businesses to provide a means by which consumers can confirm their request to limit in order to simplify implementation at this time.
    • Clarifying request to delete provisions. The revisions confirm that a business’s service provider or contractor may delete collected personal information pursuant to the written contract that it has with the business. Additionally, businesses will be permitted to provide a link to a support page or other resource that explains a consumer’s data deletion options.
    • Amending requests to correct/know. The proposed changes clarify that businesses, service providers, and contractors may delay compliance with requests to correct with respect to information stored on archived or backup systems. The amendments also, among other things, clarify that consumers should make good-faith efforts to provide businesses with all relevant information available at the time of the request, provide flexibility and discretion to a business concerning whether it will provide the consumer with the name of the source from which the business received the alleged inaccurate information, and clarify that a business only needs to disclose specific pieces of personal information that it maintains and has collected about the consumer in order to confirm that the business has corrected the inaccurate information that was the subject of the consumer’s request to correct. With respect to a consumer’s right to know, the proposed changes would allow a consumer to request a specific time period for which their request to know applies.
    • Amending opt-out preference signals. The proposed changes specify that a business that does not sell or share personal information is not required to process an opt-out preference signal as a valid request to opt-out. However, for businesses that do sell or share personal information, processing the opt-out preference signal means that the business is treating it as a valid request to opt-out of sale/sharing. The revisions also address when a business can ignore an opt-out signal to allow a consumer to continue to participate in a financial incentive program, and explain that when a consumer is known to the business, the “business shall not interpret the absence of an opt-out preference signal after the consumer previously sent an opt-out preference signal as consent to opt-in to the sale or sharing of personal information.” Moreover, a business may choose to display whether it has processed the consumer’s opt-out preference signal as a valid request to opt-out of sale/sharing on its website.
    • Amending requests to opt-out of sale/sharing. The revisions, among other things, clarify that, at a minimum, a business shall allow consumers to submit requests to opt-out of sale/sharing through an opt-out preference signal and through one of the following methods—an interactive form accessible via the “Do Not Sell or Share My Personal Information” link, the Alternative Opt-out Link, or the business’s privacy policy. The revisions also make various changes related to service provider, contractor, and third-party obligations.
    • Clarifying requests to limit use and disclosure of sensitive personal information. The revisions clarify how sensitive personal information may be used to “prevent, detect, and investigate” security incidents “even if this business purpose is not specified in the written contract required by the CCPA and these regulations.”

    The proposed changes also delete examples concerning notices of the right to opt-out of the sale/sharing of personal information through connected devices and augmented or virtual reality to simplify implementation at this time. Additionally, the proposed changes further clarify provisions related to requirements for service providers, contractors, and third parties, specifying, among other things, that businesses must contractually require these entities to provide the same level of privacy protection as is required of businesses by the CCPA and these regulations.

    Privacy, Cyber Risk & Data Security State Issues California CPPA CPRA CCPA Consumer Protection Agency Rule-Making & Guidance

  • Biden issues executive order on EU-U.S. Privacy Shield replacement

    Privacy, Cyber Risk & Data Security

    On October 7, President Biden signed an Executive Order on Enhancing Safeguards for United States Signals Intelligence Activities (E.O.) to address the facilitation of transatlantic data flows between the EU and the U.S. The E.O. outlines commitments the U.S. will take under the EU-U.S. Data Privacy Framework, which was announced in March as a replacement for the invalidated EU-U.S. Privacy Shield. As previously covered by InfoBytes, the Court of Justice of the EU (CJEU) issued an opinion in the Schrems II case (Case C-311/18) in July 2020, holding that the EU-U.S. Privacy Shield did not satisfy EU legal requirements. In annulling the EU-U.S. Privacy Shield, the CJEU determined that because the requirements of U.S. national security, public interest, and law enforcement have “primacy” over the data protection principles of the EU-U.S. Privacy Shield, the data transferred under the EU-U.S. Privacy Shield would not be subject to the same level of protections prescribed by the GDPR.

    Among other things, the E.O. bolsters privacy and civil liberty safeguards for U.S. signals intelligence-gathering activities, and establishes an “independent and binding mechanism” to enable “qualifying states and regional economic integration organizations, as designated under the E.O., to seek redress if they believe their personal data was collected through U.S. signals intelligence in a manner that violated applicable U.S. law.” Specifically, the E.O. (i) creates further safeguards for how the U.S. signals intelligence community conducts data transfers; (ii) establishes requirements for handling personal information collected through signals intelligence activities and “extends the responsibilities of legal, oversight, and compliance officials to ensure that appropriate actions are taken to remediate incidents of non-compliance”; (iii) requires the U.S. signals intelligence community to make sure policies and procedures reflect the E.O.’s new privacy and civil liberty safeguards; (iv) establishes a multi-layer review and redress mechanism, under which the Civil Liberties Protection Officer in the Office of the Director of National Intelligence (CLPO) is granted the authority to investigate complaints of improper collection and handling of personal data and may issue binding decisions on whether improper conduct occurred and what the appropriate remediation should be; (v) directs the U.S. attorney general to establish a Data Protection Review Court (DPRC) to independently review CLPO decisions, thereby serving as the second level of the E.O.’s redress mechanism (see DOJ announcement here); and (vi) calls on the Privacy and Civil Liberties Oversight Board to review U.S. signals intelligence community policies and procedures to ensure they are consistent with the E.O.

    Privacy, Cyber Risk & Data Security Federal Issues Biden EU Consumer Protection EU-US Privacy Shield Of Interest to Non-US Persons GDPR EU-US Data Privacy Framework
