A scam that saw AI harnessed to mimic the voice of a CEO in order to con a senior employee is a sign of an oncoming rise in machine learning-enhanced crime, according to a leading cybersecurity expert.
The scam was carried out in March but came to light in the last few days. It saw cybercriminals use AI to mimic the voice of the CEO of a German multinational to con the chief of a UK-based energy firm wholly owned by the company into transferring a large sum of money.
Combining a convincing voice with a sense of urgency, the caller persuaded the target that €220,000 needed to be transferred within the hour to what he believed was a Hungarian supplier. He complied.
Machine learning crime: the choice of future cybercriminals?
The use of machine learning to commit this type of crime is – at present – very rare. But according to Jake Moore, cybersecurity specialist at ESET, it is set to grow in the future.
“I predict that we will see a huge rise in machine-learned cyber-crimes in the near future,” said Moore.
“We have already seen DeepFakes imitate celebrities and public figures in video format, but these have taken around 17 hours of footage to create convincingly.
“Being able to fake voices takes fewer recordings to produce. As computing power increases, we are starting to see these become even easier to create, which paints a scary picture ahead.”
How companies can protect themselves from AI voice scams
While such machine learning-enabled crime is very hard to catch, particularly given the convincing nature of the voice imitation, Moore argues that companies can minimise their risk of falling prey to such a scam, starting with understanding what is possible and changing verification processes to protect against it.
“To reduce risks it is imperative not only to make people aware that such imitations are possible now, but also to include verification techniques before any money is transferred,” he said.
“Two-factor authentication is another powerful, inexpensive and simple technique that adds an extra layer of security to prevent your money going into a rogue account.
“When being called about a money transfer, particularly of large sums, check the number calling and ask to call back. Do so using a number in your address book, rather than hitting the ‘call back’ option in your call history.”
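The two-factor authentication Moore recommends is, in most authenticator apps, implemented as time-based one-time passwords (TOTP, RFC 6238): both parties share a secret, and a short code derived from that secret and the current time must match before a transfer proceeds. As a rough illustration of the mechanism (a minimal sketch using only the Python standard library; the function names are illustrative, not from any particular product):

```python
import base64
import hashlib
import hmac
import struct
import time


def totp(secret_b32, t=None, step=30, digits=6):
    """Generate a time-based one-time password (RFC 6238, HMAC-SHA1)."""
    key = base64.b32decode(secret_b32.upper())
    counter = int((t if t is not None else time.time()) // step)
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    # Dynamic truncation: take 4 bytes at an offset given by the last nibble.
    offset = digest[-1] & 0x0F
    code = (struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF) % 10 ** digits
    return str(code).zfill(digits)


def verify(secret_b32, code, t=None):
    """Accept the current or immediately previous time step, allowing clock drift."""
    now = t if t is not None else time.time()
    return any(
        hmac.compare_digest(totp(secret_b32, now - d * 30), code)
        for d in (0, 1)
    )
```

Against the RFC 6238 test secret (the ASCII string `12345678901234567890`, base32-encoded), `totp(secret, t=59)` yields `287082`, matching the published test vector. The point of the extra factor in this context is that a convincing voice alone is no longer enough: the caller would also need the one-time code.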
Read more: Phishing scams account for half of all fraud attacks, according to RSA