U.S. Agencies Issue Warning About AI Bias in Automated Banking, Marketing Services

U.S. regulators are warning that AI financial tools marketed as "bias-reducing" may actually produce unlawful discrimination. To that end, the Consumer Financial Protection Bureau (CFPB), the Federal Trade Commission (FTC), the Civil Rights Division of the U.S. Department of Justice, and the U.S. Equal Employment Opportunity Commission issued a joint statement calling for a commitment to “the core principles of fairness, equality, and justice” as new AI-driven fintech solutions continue to be explored.

The regulatory and enforcement agencies pointed to potential issues arising in areas like fair competition, equal opportunity, civil rights and consumer protection in their call to action. Each of the agencies has already expressed some concerns with new automated processes and has resolved to “vigorously enforce” regulations designed to protect consumers.

“We already see how AI tools can turbocharge fraud and automate discrimination, and we won’t hesitate to use the full scope of our legal authorities to protect Americans from these threats,” said FTC Chair Lina M. Khan. “Technological advances can deliver critical innovation—but claims of innovation must not be cover for lawbreaking. There is no AI exemption to the laws on the books, and the FTC will vigorously enforce the law to combat unfair or deceptive practices or unfair methods of competition.”

Among the potentially problematic technologies cited by the agencies are “black box algorithms,” which are often too complex, opaque and vague to appropriately justify the adverse decisions they render. “Digital redlining,” which can introduce algorithmic bias into the home appraisal and valuation process, was cited as another danger associated with the technology.

According to the joint statement, many automated systems use large swaths of data to identify correlations and patterns, and in turn, use those patterns to perform tasks and make predictions and recommendations. It is in these processes where unlawful discrimination might occur.

Some automated systems may be skewed by unrepresentative datasets, including sets that incorporate historical bias and similar errors, the statement adds. “Developers do not always understand or account for the contexts in which private or public entities will use their automated systems,” it reads. “Developers may design a system on the basis of flawed assumptions about its users, relevant context, or the underlying practices or procedures it may replace.”
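To see how historical bias can carry through to automated decisions, consider a minimal, entirely hypothetical sketch: a naive "model" that approves applicants based on how similar applicants fared in past records. If one group was historically denied, the learned rule reproduces that denial even when the applicants are otherwise identical. (The data and rule below are illustrative assumptions, not from the joint statement.)

```python
# Illustrative sketch with hypothetical data: a naive rule learned from
# historically biased lending records reproduces that bias.

# Toy historical records: (group, income, approved).
# Group "B" applicants were historically denied despite similar incomes.
history = [
    ("A", 50, True), ("A", 40, True), ("A", 30, False),
    ("B", 50, False), ("B", 40, False), ("B", 30, False),
]

def learned_rule(group, income):
    """Naive 'model': approve if past applicants in the same group
    were mostly approved -- note it ends up ignoring income entirely."""
    similar = [approved for g, inc, approved in history if g == group]
    return sum(similar) > len(similar) / 2

# Identical incomes, different groups -> different outcomes:
print(learned_rule("A", 50))  # True
print(learned_rule("B", 50))  # False
```

The point is that nothing in the rule mentions group membership as a criterion; the disparity emerges purely from the patterns in the training data, which is exactly the failure mode the agencies describe.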

The CFPB has proposed a registry to keep tabs on entities it identifies as being in continual violation of equitable AI practices. “Technology marketed as AI has spread to every corner of the economy, and regulators need to stay ahead of its growth to prevent discriminatory outcomes that threaten families’ financial stability,” said CFPB Director Rohit Chopra. “Today’s joint statement makes it clear that the CFPB will work with its partner enforcement agencies to root out discrimination caused by any tool or system that enables unlawful decision making.”

This spring, the bureau also plans to release a white paper detailing the limitations of deploying chatbot solutions at financial institutions.
