Within the financial industry, communication carries a risk with clear parallels to policing in the U.S., where the Miranda rights (which you always hear quoted in films) spell it out: “Anything you say can and will be used against you in a court of law.”

When it comes to brokers communicating with other brokers or their clients about investment trades, there are several rules. The words (or emojis) and the communication channels used to discuss financial transactions can make all the difference. Not only do words and channels matter to the people communicating with each other, they also matter to employers, because it is the firms that carry the risk.

The level of risk varies; it all depends on which communication channel was used to convey the message. If an approved channel is used, it is presumably being monitored. Companies can check the box there and have reasonable assurance that anything untoward will be discovered through the monitoring process. On the flip side, non-approved communication channels are generally prohibited by financial firms precisely because those channels aren’t being monitored. And if they’re not being monitored, how do you know whether any bad actors are up to no good? That’s part of the impetus behind Citigroup’s recent move to hire over 4,000 tech workers so that it can move the bank’s institutional clients fully online and monitor financial activity more closely.

Today’s technology has come a long way in its ability to monitor communication. Some RegTech is even sophisticated enough to correctly assess nuance. But the catch is that the monitored electronic communications channels must actually be used. If people are working around the system and using other vehicles to exchange ideas about financial transactions, then the sophistication of the technology deployed no longer matters.

Back in 2014, a New York City broker perpetrated a $5.6 million illegal insider trading scheme by scribbling tips on napkins passed to a waiter in Grand Central Station. The waiter would then eat the napkins to avoid detection. Even the best artificial intelligence in the world is no match for the natural, biological degradation of evidence.

Let’s take the example of illicit activities conducted in secret, like passing napkins at a busy café outside the walls of the bank or brokerage house. Yes, it is hard to discover market abuse where and when you’re not looking for it. But the JPMorgan Chase non-compliance story of 2022 is another issue entirely: the use of unapproved chat channels was happening inside the firm, right under its nose. And it gets even better: the executives were largely the ones using the unapproved e-comms channels.

Regulators levied a whopping $200 million in fines against Chase for non-compliant communications. Executives seemingly operated with impunity, allowing their staff to converse digitally over WhatsApp, an unmonitored channel. Not only was monitoring absent for nearly two years, but no conversations were captured either. Record-keeping is an essential component of compliance, and financial firms must monitor, capture, and archive all communications related to financial transactions. Today, technology exists to automate much of that workflow, including the semantic analysis of the words exchanged to identify potentially non-compliant behaviours.
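As a rough illustration of what that capture-and-screen workflow can look like, here is a minimal sketch in Python. The message data, the watchlist phrases, and the screen_message helper are all invented for illustration; real RegTech systems use far richer semantic models than this first-pass pattern check.

```python
import re
from datetime import datetime, timezone

# Hypothetical watchlist: phrases a first-pass screen might flag for human review.
WATCHLIST = [
    r"\bdelete (this|the) (chat|message)s?\b",
    r"\bkeep (this|it) off (the )?email\b",
    r"\bmove to whatsapp\b",
    r"\binside(r)? (info|information|tip)\b",
]

def screen_message(sender: str, channel: str, text: str) -> dict:
    """Capture, screen, and package a single message for archiving (illustrative only)."""
    hits = [p for p in WATCHLIST if re.search(p, text, flags=re.IGNORECASE)]
    return {
        "captured_at": datetime.now(timezone.utc).isoformat(),
        "sender": sender,
        "channel": channel,
        "text": text,
        "flagged": bool(hits),
        "matched_patterns": hits,
    }  # in practice this record would be written to an immutable archive

if __name__ == "__main__":
    record = screen_message("broker42", "approved-chat",
                            "Let's move to WhatsApp before I send the numbers.")
    print(record["flagged"], record["matched_patterns"])
```

The point of the sketch is simply that every message is captured and archived regardless of whether it is flagged; the flag only decides what gets escalated for review.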

There are consequences for non-compliance. The Chase executives involved in the WhatsApp scandal were terminated, and SEC investigations are ongoing in the search for additional transgressions. However, the use of a non-approved e-comms channel can be grounds for termination even when there is no sign of financial wrongdoing. A Credit Suisse broker was apparently fired in April 2022, after 28 years with the firm, for using his personal phone and WhatsApp to exchange financial details with his clients. This latest example offers a stark warning to all brokers who are using WhatsApp, Telegram, Signal, or any other unapproved e-comms apps. And that warning is even louder for the financial firms that employ those brokers, since it is the firms that must navigate the risk of that non-compliance.

Natural Language Processing, coupled with AI, gives RegTech solution providers and their clients a powerful combination. Simply scanning for obvious words that threaten personal safety or discriminate against people is the technology of the past. In 2022, by training monitoring systems on extensive volumes of conversational data, the search algorithms can be refined until they evolve into tools of inference. That is, some of these compliance monitoring solutions can predict when a broker is going to go rogue, even before they actually do. Changes in the frequency of a broker’s communication, or in who they’re interacting with, can trigger an alert. New behaviours, like switching language or dialect, or jumping from one e-comms channel to another in the middle of a conversation thread, can be detected and analyzed for illicit intent.
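To make the inference idea concrete, here is a minimal sketch of two such behavioural signals: a spike in a broker’s own message volume, and a conversation thread that hops channels mid-stream. The data shapes, the threshold, and both helper functions are assumptions made for illustration, not any vendor’s actual method.

```python
from statistics import mean, stdev

def spike_alert(daily_counts: list[int], today: int, z_threshold: float = 3.0) -> bool:
    """Flag a day whose message volume is far above the broker's own baseline."""
    if len(daily_counts) < 5:
        return False  # not enough history to establish a baseline
    mu, sigma = mean(daily_counts), stdev(daily_counts)
    if sigma == 0:
        return today > mu  # flat history: any increase stands out
    return (today - mu) / sigma > z_threshold

def channel_switch_alert(thread: list[dict]) -> bool:
    """Flag a conversation thread that jumps between e-comms channels mid-stream."""
    channels = {msg["channel"] for msg in thread}
    return len(channels) > 1

# Illustrative usage with made-up numbers.
history = [12, 15, 11, 14, 13, 12, 16]
print(spike_alert(history, today=60))          # True: ~60 messages vs. a ~13-per-day baseline
print(channel_switch_alert([
    {"channel": "approved-chat", "text": "re: the block trade"},
    {"channel": "whatsapp", "text": "continuing here instead"},
]))                                            # True: thread moved to an unmonitored channel
```

In a real deployment these simple rules would be one input among many to a trained model, but they show why deviation from a broker’s own baseline, rather than any single keyword, is the signal worth watching.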

That is not to say that people don’t try to cheat the system. Emojis are known to impede the parsing of text data. Those cute little symbols, which are used on Facebook alone more than 5 billion times per day, can make it difficult for monitoring solutions to assess abusive correspondence. But, as we all know, everything is about one-upmanship. Every time someone figures out a clever new hack to get past the authorities, the RegTech providers figure out a way to track the hack.
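One common mitigation for the emoji problem, sketched here with Python’s standard unicodedata module, is to translate emoji into their Unicode names so that downstream text analysis sees readable tokens instead of opaque glyphs. This is a generic illustration, not a description of how any particular monitoring product handles it.

```python
import unicodedata

def demojize(text: str) -> str:
    """Replace emoji and other symbols with their Unicode names so text
    analysers see readable tokens instead of opaque glyphs."""
    out = []
    for ch in text:
        # 'So' (Symbol, other) is the Unicode category most emoji fall into.
        if unicodedata.category(ch) == "So":
            name = unicodedata.name(ch, "UNKNOWN SYMBOL")
            out.append(f" :{name.lower().replace(' ', '_')}: ")
        else:
            out.append(ch)
    return "".join(out)

print(demojize("to the moon 🚀 keep it quiet 🤫"))
# Emoji become named tokens, e.g. ':rocket:' and ':shushing_face:',
# which keyword and semantic models can then treat like ordinary words.
```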

Bad actors may think that they’re beating the odds, but it’s only a matter of time. Always bet on tech because the tech will win.

Written by Shiran Weitzman, Co-Founder and CEO of Shield

Credit: Eric Sultan