AI Tries To Even the Playing Field for Financial Crime

Image credit: iStockphoto/Gorlov

Financial crime is having a renaissance moment.

The sudden switch to digital channels in the early days of the pandemic, combined with widespread remote working, also brought a surge in criminal activity.

Criminals exploited hidden gaps in security defenses, preyed on health and financial anxieties, and ran social engineering schemes at a time when people were living in isolation.

This rampant rise in financial crime involves two distinct groups of criminals.

“The vast majority who get caught are opportunistic criminals, perhaps an internal fraudster or a loan approval officer who is taking backhanders,” says Guy Sheppard, chief operations officer of financial services, Aboitiz Data Innovation (ADI).

The second group—the organized criminal gangs—is more nefarious. It is where AI is beginning to make its mark.

Inside the mind of a financial criminal

To fight financial crime, organizations need to start by understanding who the criminals are.

“[Financial] criminals are incredibly sophisticated; their job is to look at inconsistencies and weaknesses in different systems,” says Sheppard.

COVID-19 was a godsend for these criminals. While many banks and financial institutions already had checks and balances to identify financial crime, the sudden shift to digital created a mountainous backlog of financial crime alerts.

A key reason for the backlog is that many institutions were not set up to review “highly sensitive information remotely.” It also led to “huge peaks in help desk scams” while many crimes went unnoticed as banks tried to balance security with customer experience, explains Sheppard.

During the pandemic, the threat surface grew exponentially as institutions widened online and mobile app payment channels and moved into real-time payments. It strained anomaly detection systems and quickly depleted already overstretched resources.

The syndicates also explored new grey areas in financial crime. Take the proliferation of romance or investment scams, for example.

Sheppard notes that such crimes are highly divisive in the industry. Reason? “Is it the bank's ethical responsibility to get in the way?” he questions, adding that it is difficult for financial institutions to detect coercion when victims give away their money willingly.

Essentially, the financial crime landscape is incredibly dynamic and challenges the rules-based systems many financial institutions use to root it out. And with the advent of generative AI and other advances, those rules are quickly outpaced.

In response, financial institutions have ramped up their defenses. But the real victims are those who legitimately need financial assistance or services and do not get it because they are deemed too risky or do not have the assets to back them.

Enter AI.

Banking on intelligence

Over the past few months, ADI has been developing new proofs of concept and actual products that show how AI can be part of the financial crime-fighting system and still help the unbanked or those in need of financial assistance.

“Increasingly what we're seeing is these rules-based systems being complemented by downstream AI and machine learning models, and that's what we've been doing with our banks in the Group,” says Sheppard.
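ADI has not published the internals of that pipeline, but the general pattern is straightforward: deterministic rules raise flags, and a downstream machine learning model scores how anomalous the transaction looks. The sketch below illustrates the idea in Python, assuming scikit-learn's IsolationForest as the downstream model; the rule names, features, and thresholds are hypothetical.

```python
# Minimal sketch of a rules-plus-ML alerting pattern (not ADI's actual pipeline).
# Assumes scikit-learn; rule names, features, and thresholds are hypothetical.
import numpy as np
from sklearn.ensemble import IsolationForest

RULES = [
    ("large_amount", lambda t: t["amount"] > 10_000),
    ("new_payee", lambda t: t["payee_age_days"] < 2),
    ("odd_hour", lambda t: t["hour"] < 5),
]

def rule_flags(txn: dict) -> list[str]:
    """Return the names of the rules a transaction trips."""
    return [name for name, check in RULES if check(txn)]

# Train the downstream anomaly model on a toy history of feature vectors
# (amount, payee age in days, hour of day): illustrative features only.
history = np.array([[120, 400, 14], [60, 900, 10], [250, 30, 18], [90, 700, 9]])
model = IsolationForest(contamination=0.05, random_state=0).fit(history)

def score_transaction(txn: dict) -> dict:
    """Combine rule hits with an ML anomaly score into one prioritized alert."""
    features = np.array([[txn["amount"], txn["payee_age_days"], txn["hour"]]])
    anomaly = -model.score_samples(features)[0]  # higher means more anomalous
    flags = rule_flags(txn)
    return {"flags": flags, "anomaly_score": round(float(anomaly), 3),
            "alert": bool(flags) or anomaly > 0.6}

print(score_transaction({"amount": 15_000, "payee_age_days": 1, "hour": 3}))
```

The point of the pattern is that the rules stay auditable while the model picks up combinations no single rule anticipates.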

For example, its joint non-mortgage fintech lending project with Union Bank of the Philippines (UnionBank) and the Smith School of Business at Queen’s University in Canada, covering anti-discrimination law, artificial intelligence, and gender bias, was recently included in the Global Top 100 list of AI solutions compiled by the International Research Centre on Artificial Intelligence (IRCAI) under the auspices of UNESCO.

The study builds on earlier research showing that the consumer lending process is stacked against women and minorities. ADI used a non-mortgage fintech lending use case to apply explainable and responsible AI and address these biases.

The study investigated whether excluding gender information when assessing creditworthiness hurts or helps the groups such exclusions are meant to protect. It ultimately revealed that using gender-related data results in a significant decrease in gender discrimination and increased profitability for the firm.
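The study's own data and methodology are not reproduced here, but the mechanics of such a comparison are easy to picture: train one credit model that sees the protected attribute and one that does not, then compare how approval rates differ between groups. The sketch below uses synthetic data and scikit-learn purely to show that structure; it will not reproduce the study's finding.

```python
# Illustrative sketch of comparing a credit model with and without a protected
# attribute. NOT the UnionBank/Queen's study methodology; the data is synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5_000
income = rng.normal(50, 15, n)
gender = rng.integers(0, 2, n)                       # synthetic group labels
repaid = (income + rng.normal(0, 10, n) > 45).astype(int)

X_full = np.column_stack([income, gender])            # model that sees gender
X_blind = income.reshape(-1, 1)                       # "gender-blind" model

def approval_gap(model, X):
    """Absolute difference in predicted approval rates between the two groups."""
    approve = model.predict(X)
    return abs(approve[gender == 0].mean() - approve[gender == 1].mean())

for name, X in [("with gender", X_full), ("gender-blind", X_blind)]:
    clf = LogisticRegression(max_iter=1000).fit(X, repaid)
    print(name, "approval gap:", round(approval_gap(clf, X), 3))
```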

The project offers a guide for improving anti-discrimination laws to ensure that ML models foster a fairer and more inclusive system.

ADI also uses AI to detect mule accounts, which Sheppard calls the most challenging task in fighting financial crime.

"The challenges with mules are twofold. First, we wanted to get better at detecting new accounts being opened quickly virtually. And that's where we would score the identity documents they would provide,” says Sheppard. This information was complemented by lessons from their alternative lending model highlighted above.

“The biggest challenge is called collusion,” says Sheppard. It’s where people under significant financial pressure are approached by syndicates to open accounts or funnel transactions.

At the same time, financial criminals do know they will be monitored. “So, the first thing that any criminal would do if they took over my account would be to change my cell phone number, email, or password. [Identifying such changes] is a much more difficult piece,” says Sheppard.

Detecting mules takes a lot of resources, and not every institution is a large bank with deep pockets. So, ADI is using AI instead.

"And I think AI and machine learning become so critical because there are limitations to what rules-based systems can do. Also, if you have human analysts, imagine if you've got an alert workload of 1,000 minutes waiting to be opened in your inbox day in and day out. And this is now year four of your life as an analyst," says Sheppard.

The Mules Account Detection solution, developed in collaboration with UnionBank, uses a two-level approach. The first is at onboarding, where the AI model estimates the probability that a newly opened account is a mule account. The second kicks in once an account is flagged, assessing its behavior and the likelihood that it is a mule. The AI-driven anomaly detection model continuously learns and improves to generate scored alerts of mule activity.
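UnionBank and ADI have not disclosed the models behind the solution, so the following is only a schematic sketch of a two-level flow of this kind: a risk score at onboarding, followed by a behavior-based score for accounts that were flagged. The features, weights, and thresholds are hypothetical.

```python
# Schematic sketch of a two-level mule-detection flow. The actual UnionBank/ADI
# solution is proprietary; features, weights, and thresholds here are hypothetical.
from dataclasses import dataclass

@dataclass
class Account:
    doc_score: float             # identity-document quality score at onboarding (0-1)
    txn_velocity: float          # transactions per day since opening
    inflow_outflow_ratio: float  # funds passed straight through is close to 1.0

def onboarding_risk(acct: Account) -> float:
    """Level 1: probability-like risk score at account opening."""
    return max(0.0, 1.0 - acct.doc_score)

def behavioural_risk(acct: Account) -> float:
    """Level 2: behavior-based score for accounts flagged at onboarding."""
    pass_through = min(acct.inflow_outflow_ratio, 1.0)
    velocity = min(acct.txn_velocity / 50.0, 1.0)
    return 0.6 * pass_through + 0.4 * velocity  # illustrative weights

def scored_alert(acct: Account, onboarding_threshold: float = 0.5):
    """Only accounts flagged at level 1 are escalated to level 2 scoring."""
    risk1 = onboarding_risk(acct)
    if risk1 < onboarding_threshold:
        return None
    return {"onboarding_risk": round(risk1, 2),
            "behaviour_risk": round(behavioural_risk(acct), 2)}

print(scored_alert(Account(doc_score=0.3, txn_velocity=40, inflow_outflow_ratio=0.98)))
```

In production, both stages would be learned models fed by the anomaly detection pipeline, with the scored alerts routed into the analyst workflow described below.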

The solution encompasses workflow, alert, and case management for ease of deployment. It has configurable detection, with an improved accuracy rate in predicting the risk of mule accounts—good enough to win the Pitch! Regtech award for the Fraud & Financial Crime category hosted by Regulation Asia.

At the same time, Sheppard believes responsible AI principles and practices should be baked into MLOps, because of the need for explainability and the risk of model drift.

“We now have an entire toolkit in terms of how code should be created and a way of stress testing for different levels of bias, be it gender, ethnicity, stage of life, etc.,” says Sheppard.

"Make sure that there is explainability to how model decisions are made, and also ensuring that we are fundamentally making ethical decisions," he adds.

Police still not AI-ready

Enforcement action is a significant gap in today's fight against financial crime. It's easy for enforcement to act when the offense is conducted within the country's borders. But often, financial crime is not bounded by borders.

This requires law enforcement agencies to collaborate and share data as they look to catch financial criminals operating across borders. However, politics and the involvement of state actors can complicate this picture.

Sheppard notes that suspicious transaction reporting (STR) is growing “at an average of 15 to 20%.”

Less than 10% of all STRs are actually investigated. “And 1% of criminal assets are seized; that's the reality. And to put that into perspective, financial crime globally is about 2% to 5% of global GDP,” continues Sheppard. In Asia, the figure is about 8%.

“A lot of financial crime is increasingly cross-border. So it becomes an issue of jurisdictions. So, we know where the money is. The money is frozen but in limbo,” explains Sheppard.

However, there is hope. Sheppard is seeing more cross-border collaboration between regulatory bodies and enforcement agencies.

“Where I hope they evolve is that there are then best practices put out for recouping funds or assets from another jurisdiction, and it needs to be a simplified process," says Sheppard.

It's also one area where AI can assist but cannot solve on its own.

Winston Thomas is the editor-in-chief of CDOTrends and DigitalWorkforceTrends. He’s a singularity believer, a blockchain enthusiast, and believes we already live in a metaverse. You can reach him at [email protected].
