InfoBytes Blog

Financial Services Law Insights and Observations

  • NYDFS offers guidance to insurers on AI models

    State Issues

    On January 17, NYDFS issued a guidance letter on artificial intelligence (AI) intended to help licensed insurers understand NYDFS’s expectations for combating discrimination and bias when using AI in connection with underwriting. The guidance is aimed at all insurers authorized to write insurance in New York State and is intended to help insurers develop AI systems, data information systems, and predictive models while “mitigat[ing] potential harm to consumers.”

    The guidance letter states that while the use of AI can potentially result in more accurate underwriting and pricing of insurance, AI technology can also “reinforce and exacerbate” systemic biases and inequality. As part of the letter’s fairness principles, NYDFS states that an insurer should not use underwriting or pricing technologies “unless the insurer can establish that the data source or model… is not biased in any way” with respect to any class protected under New York insurance law. Further, insurers are expected to demonstrate that technology-driven underwriting and pricing decisions are supported by generally accepted actuarial standards of practice and based on actual or reasonably anticipated experience. The letter also notes that this guidance builds on New York Governor Hochul’s statewide policies governing AI.

    State Issues NYDFS Artificial Intelligence GAAP Racial Bias Discrimination Insurance Underwriting

  • NY AG report reveals racial disparities in homeownership and offers proposed solutions

    State Issues

    On October 31, New York AG Letitia James released a report detailing racial disparities in homeownership and access to home financing in New York. The report states that Black and Latino New Yorkers are “underrepresented” among mortgage applicants, and that white households are overall more likely to own homes than Black, Latino, or Asian households. The report also found that, regardless of credit score, income, loan size, and other factors, all applicants of color are denied mortgages at a higher rate than white applicants. In addition, the report found that disparities between white borrowers and borrowers of color persist in the context of refinance transactions and are also present in loans made by “[n]ew private-sector, non-depository lenders.”

    The report identified policy solutions that could reduce these disparities, including (i) subsidizing down payments and interest rates for first-generation homebuyers; (ii) increasing state funding for nonprofit financial institutions that support underserved communities of color; (iii) passing the New York Public Banking Act, which would create a regulatory framework for the establishment of public banks, thereby expanding access to affordable financial services in underserved areas; (iv) bolstering resources for government agencies to conduct fair lending investigations and enhancing New York’s Human Rights Law to explicitly prohibit discriminatory lending practices; and (v) exploring options for offering state-provided banking services in accessible locations to increase access to traditional banking services.

    State Issues New York State Attorney General Fair Lending Consumer Finance Lending FHA Refinance Racial Bias

  • Federal agencies reaffirm commitment to confront AI-based discrimination

    Federal Issues

    On April 25, the CFPB, DOJ, FTC, and Equal Employment Opportunity Commission released a joint statement reaffirming their commitment to protect the public from bias in automated systems and artificial intelligence (AI). “America’s commitment to the core principles of fairness, equality, and justice are deeply embedded in the federal laws that our agencies enforce to protect civil rights, fair competition, consumer protection, and equal opportunity,” the agencies said, emphasizing that existing authorities apply to the use of new technologies and responsible innovation just as they do to any other conduct. The agencies have previously expressed concerns about potentially harmful AI applications, including black box algorithms, algorithmic marketing and advertising, abusive uses of AI technology, digital redlining, and repeat offenders’ use of AI, all of which may contribute to unlawful discrimination and bias and violate consumers’ rights.

    “We already see how AI tools can turbocharge fraud and automate discrimination, and we won’t hesitate to use the full scope of our legal authorities to protect Americans from these threats,” FTC Chair Lina M. Khan said. “Technological advances can deliver critical innovation—but claims of innovation must not be cover for lawbreaking. There is no AI exemption to the laws on the books, and the FTC will vigorously enforce the law to combat unfair or deceptive practices or unfair methods of competition,” Khan added.

    CFPB Director Rohit Chopra echoed Khan’s sentiments and said the Bureau, along with other agencies, is taking measures to address unchecked AI. “While machines crunching numbers might seem capable of taking human bias out of the equation, that’s not what is happening,” Chopra said. “When consumers and regulators do not know how decisions are made by artificial intelligence, consumers are unable to participate in a fair and competitive market free from bias,” Chopra added. The Director concluded by noting that the Bureau will continue to collaborate with other agencies to enforce federal consumer financial protection laws, regardless of whether the violations occur through traditional means or advanced technologies.

    Additionally, Assistant Attorney General Kristen Clarke of the DOJ’s Civil Rights Division noted that “[a]s social media platforms, banks, landlords, employers and other businesses [] choose to rely on artificial intelligence, algorithms and other data tools to automate decision-making and to conduct business, we stand ready to hold accountable those entities that fail to address the discriminatory outcomes that too often result.”

    Federal Issues FTC CFPB DOJ Artificial Intelligence EEOC Discrimination Consumer Finance Racial Bias Fintech

  • Treasury highlights strategy to advance racial equity

    Federal Issues

    On October 25, the U.S. Treasury Department released a blog post that highlights how the Department is focusing on advancing racial equity. Among other things, the blog noted that this focus has informed the Treasury’s decision to establish “a dedicated Office of Recovery Programs and has flowed through the policy and operational decisions [it has] made to implement the historic American Rescue Plan.” According to the blog, the Office of Recovery Programs addresses urgent needs and makes lasting investments to mitigate long-term disparities by making equity a foundational priority in the delivery of the program, which has improved the circumstances of vulnerable households and created opportunities for small businesses, cities, and states. In addition, Treasury announced the appointment of Janis Bowdler to be the Department’s first Counselor for Racial Equity. The blog also noted that Treasury’s “efforts go beyond [Treasury’s] diverse, dedicated political appointees,” because Treasury is “also deeply committed to improving diversity and inclusion among the broader career Treasury workforce, where we acknowledge much more work remains to be done.”

    Federal Issues Diversity Racial Bias Department of Treasury

  • Acting comptroller discusses bias in appraisals

    Federal Issues

    On June 15, OCC acting Comptroller Michael J. Hsu delivered remarks during the CFPB’s Virtual Home Appraisal Bias Event to raise awareness of the importance of reducing bias in real estate appraisals. The event included discussions with civil rights organizations, housing policy experts, and other federal agencies on how bias can occur in real estate appraisals and automated valuation models. Biased appraisals, Hsu noted, have a large impact on lending and contribute to inequity in housing values. He pointed to data from studies showing that homes in Black neighborhoods are valued at approximately half the price of homes in neighborhoods with few or no Black residents. This difference has created a cumulative loss of $156 billion in value for majority-Black neighborhoods across the country, Hsu stated. He further emphasized that “[w]hile appraisers and the appraisal process are not often seen as parts of the banking system, there are clear intersections. Banking regulations require appraisals on certain transactions, and banks rely on third-party appraisals in their underwriting and overall risk management practices. Regulators, including the OCC, expect banks to ensure their vendors treat customers fairly and do not discriminate, and we are seeing banks held accountable for discrimination in appraisals they use.” Hsu added that holding banks accountable, while necessary, is not enough to solve the problem of biased appraisals, and that a solution will require collaboration among all stakeholders, including the attendees participating in the Bureau’s event.

    Federal Issues OCC CFPB Appraisal Racial Bias Disparate Impact Consumer Finance Bank Regulatory
