InfoBytes Blog

Financial Services Law Insights and Observations

Massachusetts’ attorney general issues AI guidance related to state UDAP law

Privacy, Cyber Risk & Data Security Massachusetts State Attorney General Artificial Intelligence UDAP CFPB

On April 16, the Massachusetts Attorney General (AG) released an advisory notice on how developers, suppliers, and users of artificial intelligence (AI) should avoid “unfair and deceptive” practices to comply with consumer protection laws. The AG noted that AI systems can harm consumers, including through bias, lack of transparency, and data privacy issues, since consumers often lack the ability to avoid or test the “appropriateness” of AI systems imposed upon them. Chapter 93A of the Massachusetts General Laws, the Massachusetts Consumer Protection Act, protects consumers against “unfair and deceptive” practices, the definition of which has evolved over time. In addition to the consumer protection law, the AG cited several other state and federal consumer protections, including the ECOA, to bolster her advisory.

The AG’s advisory construed Chapter 93A to apply to AI, clarifying that the following practices may qualify as “unfair or deceptive”: (i) a company falsely advertising the quality of its AI systems; (ii) a company supplying a defective or impractical AI system; (iii) a company misrepresenting the reliability or safety of its AI system; (iv) a company offering an AI system for sale in breach of warranty, meaning the system was unfit for the purpose for which it was sold; (v) a company using multimedia content to impersonate or deceive (such as through deepfakes, voice cloning, or chatbots used to perpetrate fraud); or (vi) a company failing to comply with other Massachusetts statutes.