Artificial Intelligence Briefing: CFPB Weighs in on Algorithmic Transparency

#artificialintelligence

The Consumer Financial Protection Bureau (CFPB) has issued a policy statement on credit decisions based on complex algorithms. On May 26, the CFPB issued Circular 2022-03, which addresses an important question about algorithmic decision-making: "When creditors make credit decisions based on complex algorithms that prevent creditors from accurately identifying the specific reasons for denying credit or taking other adverse actions, do these creditors need to comply with the Equal Credit Opportunity Act's requirement to provide a statement of specific reasons to applicants against whom adverse action is taken?" The Circular answers yes: compliance with ECOA and Regulation B is required even if complex algorithms (including AI and machine learning) make it difficult to accurately identify the specific reasons for taking the adverse action. Further, the Circular makes clear that those laws "do not permit creditors to use complex algorithms when doing so means they cannot provide the specific and accurate reasons for adverse actions."

Separately, a White House executive order calls for a study of predictive algorithms used by law enforcement agencies.


FTC's Tips on Using Artificial Intelligence and Algorithms

#artificialintelligence

Artificial intelligence (AI) technology that uses algorithms to assist in decision-making offers tremendous opportunities to make predictions and evaluate "big data." On April 8, 2020, the Federal Trade Commission (FTC) provided reminders in its Tips and Advice blog post, Using Artificial Intelligence and Algorithms. This is not the first time the FTC has focused on data analytics; in 2016, it issued a "Big Data" report. AI technology may appear objective and unbiased, but the FTC warns of the potential for unfair or discriminatory outcomes and the perpetuation of existing socioeconomic disparities.