
For decades, financial institution leadership has anticipated, planned for, and adapted to rapidly emerging technologies. AI and generative AI (GAI) are among the latest developments. While formal guidance on the use of these technologies struggles to keep pace with innovation, regulatory agencies have been actively engaged, issuing reports, recommendations, and requests for information from financial institutions and other interested parties.
Below are a few highlights from recent regulatory agency activity that may help in your financial institution’s board discussions and decisions.
June 2023: Chatbots in consumer finance (Source: CFPB)
Over 100 million consumers are expected to interact with bank chatbots by 2026 as financial institutions increasingly adopt this cost-saving function to aid customer service delivery. Chatbots, which simulate human-like conversation without actual human interaction, range in capability from simple rule-based responses driven by ladder logic and keywords to more tailored responses trained on real customer conversations and chat logs and refined over time through algorithms.
Read the CFPB chatbot guidance here.
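For readers who want a concrete sense of the "simple" end of that spectrum, the short Python sketch below shows a toy rule-based responder that matches keywords to canned answers; more tailored chatbots replace this lookup with models trained on real conversations. The keywords and responses here are hypothetical illustrations, not drawn from the CFPB guidance or any vendor product.

```python
# A toy rule-based chatbot: match keywords in the customer's message to
# canned replies. Keywords and replies are illustrative assumptions only.
RULES = {
    ("balance", "how much"): "Your current balance is shown under Accounts in online banking.",
    ("hours", "open"): "Branches are open 9 a.m. to 5 p.m., Monday through Friday.",
    ("dispute", "fraud"): "To dispute a charge, we can connect you with a live agent.",
}
FALLBACK = "I'm sorry, I didn't understand that. Would you like to speak with a representative?"


def respond(message: str) -> str:
    """Return the first canned reply whose keywords appear in the message."""
    text = message.lower()
    for keywords, reply in RULES.items():
        if any(keyword in text for keyword in keywords):
            return reply
    # More tailored chatbots replace this keyword lookup with models trained
    # on real customer conversations and chat logs.
    return FALLBACK


print(respond("What are your branch hours?"))          # branch-hours reply
print(respond("I think there is fraud on my card."))   # dispute/fraud reply
```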
March 2024: Managing Artificial Intelligence: Specific Cybersecurity Risks in the Financial Services Sector (Source: Department of the Treasury)
This report from the Department of the Treasury recommends several AI best practices that organizations in the financial services sector can use to reduce cybersecurity risks and operate in a safe, sound, and fair manner. Recommended actions include:
- Embed AI-risk management in enterprise-risk management.
- Identify AI risks specific to the institution, taking into account its existing controls and its particular uses of AI.
- Assign AI risk management responsibility to a single lead or to an existing executive position, such as the chief technology officer.
- Proactively approach data acquisition, privacy, and security to understand data origination and build a comprehensive inventory and mapping framework (a minimal inventory sketch follows this list).
- Expand third-party vendor due diligence by asking vendors in-depth questions about their AI use and integration with data retention and privacy policies.
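To make the inventory and mapping recommendation a little more concrete, the Python sketch below shows what a single, highly simplified data inventory entry might capture. The field names, sample values, and the example query are illustrative assumptions, not a format prescribed by the Treasury report.

```python
# A minimal, hypothetical AI data inventory entry: fields and values are
# assumptions for illustration, not a prescribed Treasury format.
from dataclasses import dataclass, field


@dataclass
class DataInventoryEntry:
    dataset_name: str                 # internal name of the dataset
    origin: str                       # where the data comes from (system, vendor, channel)
    owner: str                        # accountable business or technology owner
    contains_pii: bool                # whether personally identifiable information is present
    used_by_models: list = field(default_factory=list)  # AI/ML systems that consume the data
    retention_policy: str = "unspecified"                # applicable retention rule


inventory = [
    DataInventoryEntry(
        dataset_name="chat_transcripts",
        origin="customer service chatbot",
        owner="Digital Banking",
        contains_pii=True,
        used_by_models=["chatbot_intent_classifier"],
        retention_policy="24 months",
    ),
]

# One mapping question a risk team might ask of such an inventory:
# which datasets containing PII feed AI models?
pii_feeding_models = [e.dataset_name for e in inventory if e.contains_pii and e.used_by_models]
print(pii_feeding_models)  # ['chat_transcripts']
```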
May 2024: 2024 National Strategy for Combating Terrorist and Other Illicit Financing (Source: Department of the Treasury)
This national strategy outlines the U.S. government’s goals, objectives, and priorities for disrupting and preventing illicit financial activities amid ever-changing global geopolitical and economic conditions. It evaluates anti-money laundering and countering the financing of terrorism (AML/CFT) threats and the efforts to address them. It also suggests how a financial institution’s deployment of machine learning and generative AI can strengthen AML/CFT compliance through the rapid analysis of large datasets. Developed in collaboration with the FDIC, FRB, NCUA, OCC, SEC, and other agencies, the report notes that further study is needed to understand the extent to which AI can be applied to identify patterns, risks, and trends that threaten the financial system.
Read the Strategy here.
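For a hedged sense of what "rapid analysis of large datasets" can look like in practice, the sketch below runs an off-the-shelf anomaly detector (scikit-learn's IsolationForest, chosen here purely for illustration; the strategy does not name specific tools) over synthetic transaction features and flags outliers for analyst review. Real AML/CFT monitoring involves far richer data, model validation, and human judgment.

```python
# Illustrative only: unsupervised anomaly detection over synthetic transaction
# features. Data, features, and thresholds are assumptions, not AML/CFT guidance.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(seed=0)

# Synthetic transactions: [amount in dollars, transactions in the past 24 hours]
typical = rng.normal(loc=[120.0, 3.0], scale=[40.0, 1.0], size=(10_000, 2))
unusual = rng.normal(loc=[9_500.0, 30.0], scale=[500.0, 5.0], size=(20, 2))
transactions = np.vstack([typical, unusual])

# Fit an isolation forest and flag the most atypical records (-1 = anomaly).
model = IsolationForest(contamination=0.005, random_state=0)
flags = model.fit_predict(transactions)

flagged = transactions[flags == -1]
print(f"Flagged {len(flagged)} of {len(transactions)} transactions for analyst review")
```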
June 2024: Request for Information on Uses, Opportunities, and Risks of Artificial Intelligence in the Financial Services Sector (Source: Department of the Treasury)
Noting that AI powers many non-bank companies’ capabilities and services, this RFI invited financial institutions (which it defines broadly as any company that facilitates or provides financial products or services) to share their perspectives on the opportunities and risks that AI developments and applications present within the sector. Although the comment period closed in August 2024 and further action is pending, more than 100 comments are available for viewing. View the comments here.
August 2024: Response to RFI (Source: American Bankers Association)
The ABA highlighted the distinction between traditional AI (which responds to specific data inputs) and GAI (which learns from data inputs and makes decisions and predictions to generate synthetic content). It also requested that regulators apply consistent guidance and requirements to banks and non-banks offering financial services. Moreover, the ABA stressed the importance of transparent and congruent customer disclosures regarding information collection, storage, sharing, and use.
Read the ABA response here.
Want to offer more perspective and actionable steps to guide your board members in addressing AI? Click here.
The regulatory landscape for financial institutions’ use of AI continues to progress. Contact your Rehmann advisor for a personal consultation or contact Joe Sarnicola at [email protected] or 616.975.4100; or Jessica Dore, CISA, at 989.797.8391 or [email protected].