Fighting FinCrime in financial services: optimising the balance between innovation and compliance

Senior leaders from across the financial services industry gathered to discuss data-driven strategies for optimising financial crime prevention and compliance.

As financial crime grows increasingly sophisticated, financial institutions face mounting pressure to combat threats efficiently while adhering to expanding regulatory requirements and tightening budgets.

Some financial institutions are starting to leverage emerging technologies like artificial intelligence and machine learning to improve financial crime detection, while others are turning to automation to streamline processes like know your customer (KYC), anti-money laundering (AML) and sanctions screening.

Many find themselves having to balance robust customer onboarding and due diligence with a smooth customer experience. They also have to take into account the ethical implications of using customer data for compliance purposes and evaluate cost-effective ways to meet expanding regulatory requirements.

At a roundtable event in central London hosted by FStech and SmartSearch, senior leaders from across the financial services industry met to discuss the challenges they face in fighting FinCrime in an era of increasingly complex compliance obligations, intelligent automation and advanced analytics.

Streamlined information

The evening started by looking at which untapped data sources have the most potential for improving financial crime detection and how financial services institutions (FSIs) can better leverage them. The head of compliance at an insurance company said that there needs to be an improvement in the way corporate due diligence is conducted. “We do very bespoke deals and getting that information and making it streamlined is one of our biggest roadblocks,” they said. “It would be so much easier if you could put the company’s name into the underwriting system and immediately get all the publicly available information.”

They added that there is also an issue with the quality of data collected. Each insurance firm has its own version of a KYC form, they said, and they all collect different data in different ways. The risk director at an online payments company said that there need to be better lines of communication with banking partners. If there is a problem, some companies find out too late.

The director at an American bank agreed and said they had struggled with building good models to track money laundering and other suspicious activity. “This information is not always communicated well across different organisations or different countries,” they said. “I understand that some information is restricted, but if you want improvement you have to make a concentrated effort to share this information so that data is transparent.”

The director of financial crime at a banking services provider said that it could be useful to get information about underlying portfolios. When they manually request information from customers for monitoring purposes using templates, customers respond in different ways, and the data is hard to use when it arrives in inconsistent formats.

The chief technology officer at a challenger bank said that it can be hard to establish the identity of a customer and that the bank is putting a lot of effort into identifying customers through different channels. “We have created this portal, but adoption levels are low and we are trying to improve that and automate the documents and emails that go with it,” they added.

Sharing information

James Birks, enterprise business development manager at SmartSearch, said that it is clear FSIs need to collaborate. “If we could get government bodies to communicate and share information, it would make everyone’s lives easier,” he said. “Everyone seems to have their own data sets. There are some key things we could standardise and unify across countries that would help.” The head of financial crime and controls at a UK bank said that FSIs needed to leverage both internal and external data. While there is a lot to process, AI and automation could help FSIs with the job, they continued.

The head of financial institutions at a national bank said that one underrated source of information is the search engine Google. “Part of my coverage is Swiss private banks and I have to ask them hard questions, such as which regulator did they come up against recently,” they said. “Through Google, I can see that a raid happened, and people went into their offices.”

The head of counter fraud and financial crime at an insurance company said that the most powerful resource for FSIs is the human brain and some firms are underutilising it. “Data is not just data, it’s actually information,” they said. “What we’ve done is go down the road of spreadsheets and formatted data, but we’ve forgotten how to turn that into information.”

Staying one step ahead

Another senior leader agreed and said that fraudsters are usually one step ahead of those trying to catch them. “Financial crime is committed by humans and we have to watch out for the likely scenarios,” they said. The director of FinCrime at a bank cautioned against making the fight against financial crime purely about collecting data. They continued: “I’ve worked in legacy institutions with customers that were onboarded in the 80s when technology wasn’t an option. Trying to convert all of that into meaningful data relies heavily on the human brain and it takes a lot of effort to join up the dots and connect them for KYC purposes and AML investigations.”

The director of financial crime compliance at a bank said that it can be challenging to get hold of and analyse the data correctly, adding that there is often not enough information supplied to them when investigating financial crime. “There’s not always enough guidance and information from the regulator,” they continued. The head of financial crime and controls at an international bank made the point that simply holding information is not enough to prevent fraud. The information has to be of high quality and FSIs should go out of their way to find it from organisations like Companies House, they added.
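On the point about going out to public registries for higher-quality data, the sketch below shows what a programmatic lookup might look like, using the Companies House public search API. This is a minimal illustration, not anything the speakers described: the API key is a placeholder, and error handling and rate limiting are omitted for brevity.

```python
# Minimal sketch: search the Companies House public register by company name.
# The API key below is a placeholder; Companies House issues real keys to
# registered developers and expects them as the basic-auth username.
import requests

API_KEY = "your-companies-house-api-key"  # placeholder
BASE = "https://api.company-information.service.gov.uk"

def search_company(name: str) -> list[dict]:
    """Return basic public details for companies matching a name."""
    resp = requests.get(
        f"{BASE}/search/companies",
        params={"q": name},
        auth=(API_KEY, ""),  # key as username, empty password
        timeout=10,
    )
    resp.raise_for_status()
    return [
        {
            "name": item.get("title"),
            "number": item.get("company_number"),
            "status": item.get("company_status"),
        }
        for item in resp.json().get("items", [])
    ]

if __name__ == "__main__":
    for company in search_company("Example Trading Ltd")[:5]:
        print(company)
```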

The risks of AI

The discussion moved on to the risks and opportunities for firms in using newer technologies such as AI to prevent fraud. A senior leader working at a payments company said that AI could be used to help people make decisions more quickly, rather than being relied on to make the decisions itself. “That’s where we get into big trouble,” they warned. “The real use cases for AI won’t emerge for another few years, but everyone is just throwing money at it.”

They continued: “Look at cloud, it has taken so long for the regulator to get comfortable with it – how long is it going to take for them to allow AI to make decisions?” They added that AI is currently “not worth the risk,” suggesting that management at FSIs are traditionally risk averse and unlikely to take on the extra responsibility needed to implement systems based on AI.

Ethical considerations

The director at an international bank raised the point that there are ethical considerations to take into account. “You can train your AI on a particular thing, but the algorithm could start to discriminate against people and then what do you do?” they asked. “People are desperate to use AI to save money but we need to design it properly and this will take time.” The deputy MLRO at an investment firm said FSIs need to put AI into context and be aware of how AI is working. “AI is a million times more complicated than just data sets, I’m not against it but firms need to be aware of potentially discriminatory practices,” they said.

Another senior leader said that it is essential for FSIs to have rigorous AI policies in place, adding that because some AI systems are publicly accessible it is easier to go “off road” when training them. “You can create all sorts of issues from a data protection perspective,” they said. “You need controls in place, but it’s essentially already too late and there has been stuff that has already slipped through the net.”

The head of financial institutions and wholesale credit risk at a British bank said that the crimes committed are the same, but that the criminals utilise technology to their advantage. “It’s a game changer,” they said. “They share information on the dark web.”

One head of financial crime and controls believes that FSIs can harness technology to their advantage. “Machine learning is very good at spotting patterns and anomalies,” they said. “If you can characterise the pattern of normal transactions or behaviour to a particular type of fraud, it is a very effective way of focusing human effort on things that are high risk.” They added that this has already been happening in the cyber security space and with transaction monitoring, adding that it could be extended to other use cases.
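As a rough illustration of that approach, the sketch below trains an off-the-shelf isolation forest on simulated “normal” transaction behaviour and flags outliers for human review. The features, figures and thresholds here are entirely illustrative assumptions, not any firm’s production model.

```python
# Minimal sketch: learn a baseline of "normal" transactions, then surface
# anomalies so human investigators can focus on high-risk activity.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(42)

# Simulated historical behaviour: modest amounts, daytime activity,
# low recent-transaction counts. All figures are illustrative.
normal = np.column_stack([
    rng.lognormal(mean=3.5, sigma=0.6, size=5000),  # amount
    rng.normal(loc=14, scale=3, size=5000),         # hour of day
    rng.poisson(lam=2, size=5000),                  # txns in past 24h
])

# Fit on history so the model characterises what "normal" looks like.
model = IsolationForest(contamination=0.01, random_state=42)
model.fit(normal)

# Score new activity: predict() returns -1 for anomalies, 1 for routine.
new_txns = np.array([
    [30.0, 15.0, 2],     # small daytime payment
    [9500.0, 3.0, 14],   # large 3am payment after a burst of activity
])
for txn, flag in zip(new_txns, model.predict(new_txns)):
    status = "REVIEW" if flag == -1 else "ok"
    print(f"{status}: amount={txn[0]:.2f}, hour={txn[1]:.0f}, recent={txn[2]:.0f}")
```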

Maintaining trust

The head of information technology at a bank said that FSIs could collaborate on the development of AI systems, but there would need to be a lot of trust involved. It is likely that many institutions would like to remain anonymous despite the need to collaborate in real time, they explained.

Another senior leader made the point that there could be political implications. “If some bad actor steals your ID and gets your name put on a list, how do you get off it? If someone acted as a money mule and tried to open a bank account in your name, that could mess up multiple things,” they said.

The director of financial crime at an international bank agreed that AI poses many risks. They cited an instance where AI had been used to copy a person’s voice during a conference call and steal a significant sum of money before the call had ended. The head of financial institutions and wholesale credit risk at a national bank said that while so-called AI hallucinations could be a problem and lead to mistakes, FSIs might be taking the wrong approach.

“Things like ChatGPT are made to create outputs that are convincing to humans, they are not made to think like we do,” they said. “As a credit person, I am very scared because at some point I won’t be able to tell whether a 30-page report was produced by AI or by my analyst. That doesn’t mean the AI is thinking. Hallucinations are where AI is trying to convince humans it is real – but that was its purpose in the first place, and it is extremely complicated.”

Closing the evening, Zowie Lees-Howell, VP of enterprise sales at SmartSearch, remarked that it is encouraging that so many FSIs are striving to create a good customer experience as well as meet regulatory requirements. “There are lots of disparate data sources and legacy platforms out there,” she concluded. “Firms have to adapt and take in all of that data at the right time and it is encouraging to see them using a range of tools to do this.”

Discover more by visiting SmartSearch.com
