How fraudsters are using AI

In the first half of 2022 alone, UK Finance revealed that criminals stole a total of £609.8 million through authorised and unauthorised fraud and scams. The same report shows that these figures were even higher during the pandemic. It's therefore no surprise that the financial services industry is constantly looking for ways to strengthen security protocols and protect consumers, with tools such as two-factor authentication and biometric authentication becoming the norm.

However, professional fraudsters are also seeking out new ways to exploit consumers, something the finance industry needs to respond to quickly. Just recently, an investigation by Which? revealed that criminals are increasingly intercepting one-time passcodes delivered via SMS, putting customers at risk. The investigation identified further weaknesses, including insecure passwords, lax checks on new payees and vulnerable log-in processes.

While fraudsters recognise that they cannot readily fake an account holder's fingerprints or face, they are now turning to AI-enabled social engineering tools to generate a brand new or synthetic identity, creating fake bank accounts and then committing fraud. The aforementioned ChatGPT, the AI-powered chatbot, has taken the internet by storm.

Fraud is always at its most virulent during economic downturns and crises. In fact, in the first nine months of 2022, over 309,000 cases were reported to the National Fraud Database, a 17% rise compared to the previous year. This increase was mainly driven by the rise in false application and identity fraud, up by 45% and 34% respectively. It's perhaps no surprise that fraudsters will see new AI-powered technologies, such as ChatGPT, as a golden ticket to exploit vulnerable people. Others may turn to opportunistic fraud if the resources are available to them, highlighting the need for organisations to invest in both predictive and preventative technology if they are to protect consumers.