Banks see benefits of AI in regulatory compliance

 

AI helps institutions comply with confidence 

 

It’s no wonder regulatory compliance is cited as one of the top concerns of banking leaders right now. Regulatory requirements for financial institutions are increasing, adding time and complexity to compliance management. But, with the support of AI, institutions are finding opportunities to improve efficiencies and better prepare for compliance.


“The pressure and cost to comply with regulations on a bank’s compliance management system and team can lead to stress, burnout and human error.”

Leslie Watson-Stracener

Managing Director and Regulatory Compliance Capability Leader
Grant Thornton Advisors LLC

 

“The regulatory burden and cost to comply is only growing, which leaves banks doing more testing and monitoring with the same amount of resources,” said Grant Thornton Managing Director and Regulatory Compliance Capability Leader Leslie Watson-Stracener. “This type of pressure on a compliance management system and team can lead to stress, burnout and human error.”

 

When that stress, burnout and human error add up, steps in a financial institution’s control design and assessment processes can be overlooked, leading to consent orders that require remediation.

 

“When financial institutions have to remediate consumers, it becomes costly. Not just because of the civil money penalties, but also through reputational harm, among both consumers and investors,” Watson-Stracener said.

 

 

 

More regulations mean more controls

 

Across all regulatory requirements, banks are evaluated on their internal controls to ensure they’re operating safely, soundly and in compliance with applicable regulations. To prepare for these external evaluations, institutions should conduct their own internal regulatory testing and monitoring.

 

To evaluate whether its internal controls are functioning effectively, an institution first assesses the processes it has in place with a Risk and Control Matrix (RACM). A RACM verifies that controls sufficiently mitigate risks, prevent fraud and ensure compliance in areas like Anti-Money Laundering, Know-Your-Customer, data privacy and information security. This process also identifies when a control is weak or has exceptions.

 

Weak controls can be identified by signs such as the following (a brief screening sketch follows the list):

 

  • A lack of segregation of duties: If an individual has access to multiple functions within an organization, it can increase the risk of manipulation or fraud.
  • Inadequate monitoring and reporting: This can lead to undetected fraud or errors.
  • Weak password policies: Insufficient security measures can expose institutions to cyber threats and unauthorized access.
  • Insufficient training and awareness: Employees who aren’t trained may unintentionally violate policies, leading to non-compliance.
  • Lack of oversight: Employees who operate without checks can misuse resources or risk fraudulent activity.
  • Missing documentation: If necessary documentation supporting financial transactions isn’t available, it’s likely the control isn’t doing its job.
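
As a concrete illustration, the Python sketch below screens a toy Risk and Control Matrix for a few of these weak-control signs. The field names, sample rows and checks are illustrative assumptions rather than a standard RACM schema.

# Minimal sketch of screening a Risk and Control Matrix (RACM) for weak-control
# signs. Field names and sample rows are illustrative assumptions, not a
# standard RACM schema.

racm = [
    {
        "risk": "Unauthorized wire transfers",
        "control": "Dual approval for wires over $10,000",
        "segregation_of_duties": True,
        "monitored": True,
        "documented": True,
        "owner_trained": True,
    },
    {
        "risk": "Inaccurate HMDA reporting",
        "control": "Quarterly manual review of LAR data",
        "segregation_of_duties": False,   # same analyst prepares and reviews
        "monitored": False,
        "documented": True,
        "owner_trained": False,
    },
]

# Map each checked field to the weak-control sign it represents.
WEAKNESS_CHECKS = {
    "segregation_of_duties": "Lack of segregation of duties",
    "monitored": "Inadequate monitoring and reporting",
    "documented": "Missing documentation",
    "owner_trained": "Insufficient training and awareness",
}

def screen_controls(matrix):
    """Return (control, weaknesses) pairs for controls showing weak-control signs."""
    findings = []
    for row in matrix:
        weaknesses = [
            label for field, label in WEAKNESS_CHECKS.items()
            if not row.get(field, False)
        ]
        if weaknesses:
            findings.append((row["control"], weaknesses))
    return findings

for control, weaknesses in screen_controls(racm):
    print(f"{control}: {', '.join(weaknesses)}")
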

 

For most financial institutions, these controls are manual, meaning they depend on human intervention and decision-making, which takes time. An added challenge is that every institution handles these processes differently, so banks often need to design unique controls.

 

You might think incorporating AI into compliance management would only add more risk. But when institutions work with partners to strategically integrate AI into very specific parts of the control design and assessment process, they can improve efficiency for their staff and gain confidence in their compliance processes.

 

 

 

AI brings new benefits to control design assessment

 

Instead of relying exclusively on manual controls, institutions can use automated controls, supported by AI tools, to monitor risks and handle parts of the control design and assessment process that typically rely on humans.

 

“AI tools are useful in creating and testing Compliance Management System (CMS) programs because they can quickly match the most recent guidance provided by regulators to the bank’s CMS plan and monitoring routines and ensure they align with any new or updated regulations,” Watson-Stracener said.

 

For example, Grant Thornton Advisory Services Senior Manager Wes Luckock explained, a generative AI tool that is fine-tuned with an organization’s risk definitions and compliance requirements can support the control design process.

 

The tool can help identify potentially missing risks, as well as missing controls that can be leveraged to further mitigate those risks — “quickly getting us a shortlist of recommendations for improvement, performing that control design assessment for us,” he said. “We can upload notes and a transcript from a meeting with the stakeholder who owns the control process, along with the risk and control descriptions — AI evaluates all that documentation to come to a conclusion as to whether or not the process is designed effectively for that risk as it’s intended. If it’s not, the AI model will make recommendations for improvement.”
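
The Python sketch below outlines what such an AI-assisted control design assessment might look like: the risk description, control description and stakeholder meeting notes are assembled into a single review request and sent to a model. The prompt structure and the call_model() stub are assumptions for illustration; a real implementation would call the institution’s approved, fine-tuned model and validate its output.

# Illustrative sketch of an AI-assisted control design assessment.
# call_model() is a placeholder so the sketch runs without external services.

import json

def build_assessment_prompt(risk: str, control: str, meeting_notes: str) -> str:
    """Assemble the risk description, control description and stakeholder
    meeting notes into a single review request."""
    return (
        "You are assessing whether a compliance control is designed effectively.\n"
        f"Risk: {risk}\n"
        f"Control: {control}\n"
        f"Stakeholder meeting notes:\n{meeting_notes}\n"
        "Answer in JSON with keys: effective (true/false), gaps (list), "
        "recommendations (list)."
    )

def call_model(prompt: str) -> str:
    """Placeholder for the model call; returns a canned response for the sketch."""
    return json.dumps({
        "effective": False,
        "gaps": ["No secondary review of exception reports"],
        "recommendations": ["Add an independent reviewer for monthly exceptions"],
    })

prompt = build_assessment_prompt(
    risk="Mortgage loans closed without required flood insurance",
    control="Loan processor checks the flood certificate before closing",
    meeting_notes="Processor confirms the check is manual and not logged.",
)
assessment = json.loads(call_model(prompt))
print("Designed effectively:", assessment["effective"])
for rec in assessment["recommendations"]:
    print("Recommendation:", rec)
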

 

This use of AI frees up staff time. As financial institutions grow and staff responsibilities increase, automating or outsourcing control design work in this way lets teams focus their energy on higher-value priorities.

 

Example regulations where AI language models can support the compliance process include the following (a simple rule-check sketch follows the list):
  • Home Mortgage Disclosure Act (HMDA): AI can support transactional testing by identifying exceptions to a known answer. For example, HMDA data requires an institution to report whether a mortgage loan has a prepayment penalty. If an institution doesn’t offer a mortgage product with a prepayment penalty, that data field should always be reported as “N/A.”
  • Truth in Lending Act (TILA): AI can also support transactional testing under TILA, which is intended to protect consumers in credit transactions. In this case, AI could automatically answer whether a fee is designated as a prepaid finance charge. If at a specific institution the answer is always “yes,” or always “yes” for a specific product, AI can make that determination far faster than a human manually entering the data.
  • Flood Disaster Protection Act: If an institution’s flood certificate identifies the property address is within a Special Flood Hazard Area, the mortgage loan should always have flood insurance coverage – something the AI model can quickly assess, determine and apply to the data set.
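
Here is a minimal Python sketch of the kind of rule checks behind this transactional testing, assuming illustrative loan-record field names (real HMDA, TILA and flood data fields vary by institution):

# Illustrative transactional testing rules for the three regulations above.
# The loan-record fields and sample data are assumptions for this sketch.

def test_loan_record(loan: dict) -> list[str]:
    """Return a list of exceptions found in a single loan record."""
    exceptions = []

    # HMDA: if the institution offers no products with prepayment penalties,
    # the prepayment penalty field should always be reported as "N/A".
    if (not loan["institution_offers_prepayment_penalty"]
            and loan["hmda_prepayment_penalty"] != "N/A"):
        exceptions.append("HMDA: prepayment penalty field should be N/A")

    # TILA: for products where the fee is always a prepaid finance charge,
    # the charge should be disclosed on every transaction.
    if (loan["product_fee_is_prepaid_finance_charge"]
            and not loan["prepaid_finance_charge_disclosed"]):
        exceptions.append("TILA: prepaid finance charge not disclosed")

    # Flood Disaster Protection Act: property in a Special Flood Hazard Area
    # must carry flood insurance coverage.
    if loan["in_special_flood_hazard_area"] and not loan["flood_insurance_in_force"]:
        exceptions.append("Flood: SFHA property without flood insurance coverage")

    return exceptions

sample_loan = {
    "institution_offers_prepayment_penalty": False,
    "hmda_prepayment_penalty": "N/A",
    "product_fee_is_prepaid_finance_charge": True,
    "prepaid_finance_charge_disclosed": True,
    "in_special_flood_hazard_area": True,
    "flood_insurance_in_force": False,
}
print(test_loan_record(sample_loan))

In practice, an AI-enabled tool would apply checks like these across every loan file and surface only the exceptions for human review.
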

 

 

 

Applying AI in compliance is just the beginning


“Across the business cycle, AI will be coming into play in an end-to-end manner. It’s not just going to be a couple tasks throughout the cycle — it’s going to be the entire cycle.”

Wes Luckock

Senior Manager, Advisory Services
Grant Thornton Advisors LLC

 

Using AI to support very specific aspects of your regulatory compliance work is just the tip of the iceberg, Luckock said. “Across the business cycle, AI will be coming into play in an end-to-end manner. It’s not just going to be a couple tasks throughout the cycle — it’s going to be the entire cycle.”

 

To prepare for an AI-enabled future of banking, Luckock noted, institutions should dial in their data management. For many institutions, data is one of the biggest challenges. But AI-driven tools can also help address the data issues that many institutions struggle with.

 

“Every institution tracks their data differently and in varying formats, and when data is unstructured and messy, it’s difficult to leverage technology to its fullest extent,” Luckock said. “We’ve been able to build AI solutions that can leverage and handle that wide range of unknowns from a data standpoint to really solve for that challenge and ensure data is accurate and clean.”

 

But as much value as AI tools provide, human intervention isn’t going anywhere. As institutions explore the possibilities of AI and other automated technology, even when working with a third party to support the work, it’s important to be mindful of ethics.

 

“Always make sure your board has oversight of your AI practices,” Watson-Stracener said. “And test your results. Even when an AI tool may be doing the heavy lifting of analyzing data or comparing information, you should still build sampling and checking for anomalies into your process.”

 
 
