Compliance must embrace – and understand – AI

Compliance teams are overstretched. It’s become imperative that they find ways to leverage technology to become leaner, more effective, and better able to handle increasing demands. But they’re not alone in these efforts; the most recent OCIE risk alert indicates that organizations are also responsible for ensuring their compliance programs are sufficiently supported with both staff and technology.

As we’ve discussed before, an over-reliance on manual processes leaves compliance teams overwhelmed by low- and moderate-risk issues. Technology and automation have to be part of the equation so that teams can focus on the riskiest issues that matter most to the business.

As technology gets more intelligent, artificial intelligence (AI) emerges as a catalyst for enhancing a program’s efficiency. As we’ve mentioned, this can lead to a more mature, impactful compliance program and increased trust throughout the organization.

However, as programs mature and manual processes shift into automation, compliance teams will need a deeper understanding of how that automation works. AI is an important tool, but at some point, compliance will be asked to explain how they supervise and test these tools to confirm they’re functioning as designed and expected.

At its core, AI is designed to monitor a data set and, when a logical trigger is set off, translate that information into an action. In some instances, that translation is clear and easily understood. In others, especially when the way the AI translates data into actions is hidden inside a “black box” due to intellectual property concerns, explaining it to a regulator becomes more difficult.
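To make that distinction concrete, here is a minimal Python sketch of a fully transparent trigger-to-action translation. The rule, term list, and function names are illustrative assumptions for this post, not anyone’s actual supervision logic:

```python
# Illustrative only: a transparent data-to-action translation where the
# trigger (a prohibited term) and the action (flag for review) are both
# documented and easy to explain to a regulator.

PROHIBITED_TERMS = {"guaranteed return", "risk-free", "can't lose"}

def review_message(message: str) -> dict:
    """Return the resulting action plus the trigger that produced it."""
    hits = [term for term in PROHIBITED_TERMS if term in message.lower()]
    if hits:
        # Term X appeared, so the message was flagged -- nothing hidden.
        return {"action": "flag_for_review", "triggered_by": hits}
    return {"action": "approve", "triggered_by": []}

print(review_message("This fund offers a guaranteed return every quarter."))
# {'action': 'flag_for_review', 'triggered_by': ['guaranteed return']}
```

A model whose internals can’t be inspected may produce the same “flag” or “approve” output, but without the “triggered_by” explanation, which is exactly where the black-box problem shows up.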

As FINRA wrote in its June 2020 report on AI, and reiterated during its November conference on AI, a compliance professional needs to understand how the AI they are implementing aligns with regulatory expectations. That includes a documented understanding of how data is translated into actions and a method for regularly testing the system to validate that it meets legal and regulatory requirements. When the algorithm informing your AI is hidden in a “black box,” this can prove difficult.
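One way to approach that regular testing, sketched below in Python under assumed names, is to keep a small benchmark of messages with known expected outcomes and re-run the tool against it on a schedule. The `classify` callable and sample cases are hypothetical stand-ins for whatever AI tool is being supervised, including a vendor system reached over an API; this is not a method prescribed by FINRA:

```python
# Illustrative only: periodic validation of an AI tool against a labeled
# benchmark, so there is a documented record that it behaves as expected.

from typing import Callable

# Labeled benchmark of messages with known expected outcomes (sample data).
BENCHMARK_CASES = [
    ("This fund offers a guaranteed return.", "flag_for_review"),
    ("Happy to walk you through your quarterly statement.", "approve"),
]

def validate(classify: Callable[[str], str], minimum_accuracy: float = 1.0) -> bool:
    """Re-run the known cases and confirm the tool still behaves as documented."""
    correct = sum(1 for text, expected in BENCHMARK_CASES if classify(text) == expected)
    accuracy = correct / len(BENCHMARK_CASES)
    # Print the result so each validation run leaves an audit trail.
    print(f"validation accuracy: {accuracy:.0%} ({correct}/{len(BENCHMARK_CASES)})")
    return accuracy >= minimum_accuracy

# Example with a stand-in classifier; swap in the real tool under test.
validate(lambda text: "flag_for_review" if "guaranteed" in text.lower() else "approve")
```

Even when the model itself is a black box, a benchmark like this gives compliance something concrete to show a regulator: the expected outcomes, the test cadence, and the results of each run.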

It might be time to evaluate your firm’s use of AI in its supervision policies. If, in the course of that review, you have any questions about AI or how to prepare for a regulatory audit, feel free to reach out to your Hearsay account team.

Bill Simpson
By day, Bill Simpson is Hearsay’s Director of Compliance, helping deliver technologically compliant solutions to Hearsay’s global customer base. Once he logs off (or sometimes before he logs on!), he can be found tending to his BBQ pit, playing with his two boys, or enjoying collaborative storytelling via tabletop RPGs.
