Wonderful Engineering

These Bank Robbers Used Deepfake To Imitate Boss’ Voice And Siphon Off $35 Million


Excess of everything is bad…

Not long ago, Justin Bieber fell prey to a deepfake version of Tom Cruise, and now it seems the technology is even being used to commit heists. A bank in the United Arab Emirates was fooled by criminals using AI voice cloning, a scam that ended with them stealing $35 million. Deepfake technology is getting more advanced by the day, and more dangerous as well…

According to court documents obtained by Forbes, the robbers used a deepfaked voice of a company executive to trick a bank manager into transferring millions of dollars in early 2020. The cloned voice told the manager that the company was about to make an acquisition and needed the money to complete it. Having worked with the executive before, the manager believed he was speaking to the real person and authorized the transfer, sending the money straight into the criminals’ accounts without them ever having to physically rob a bank.

The UAE has now asked American investigators to help trace an estimated $400,000 that went into US-based accounts. Dubai investigators believe the heist involved at least 17 people who used “deep voice” technology and sent the money to different banks around the world.

As it turns out, this isn’t the first time deepfake technology has been used for illegal purposes rather than for the benefit of humanity. In 2019, criminals used AI to impersonate an executive’s voice and steal $243,000, according to The Wall Street Journal. The CEO of a UK-based energy firm thought he was speaking to the chief executive of the firm’s German parent company, who asked him to send the funds to a Hungarian supplier.

Scams using deepfake technology and artificial intelligence are becoming a new concern for companies, as few cybersecurity tools are equipped to deal with spoofed voice recordings. It looks like companies will now need to invest in products that can detect deepfake recordings.