It's already getting tough to discern real text from fake, genuine video from deepfake. Now, it appears that use of fake voice tech is on the rise too. That's according to the Wall Street Journal, which reported a case of voice fraud -- aka vishing (short for "voice phishing") -- that cost a company $243,000. In March, criminals used commercially available voice-generating AI software to impersonate the boss of a German parent company that owns a UK-based energy firm. They then tricked the latter's chief executive into urgently wiring the $243,000 to a Hungarian supplier within the hour, with assurances that the transfer would be reimbursed immediately.
It's easy enough to forge a signature for fraudulent purposes. Until recently, however, some things -- like our voices -- have remained distinctive and difficult to mimic. A new kind of cybercrime that combines artificial intelligence and voice technology is one of the unfortunate developments of the digital age. As deepfake videos have shown, you can't trust what you see; now, it seems, you can't trust what you hear either. A $243,000 voice fraud case, reported by the Wall Street Journal, proves it.
Criminals are using AI-generated audio to impersonate a CEO's voice and con subordinates into transferring funds to a scammer's account. So-called deepfake voice attacks could be the next frontier in a scam that has cost US businesses almost $2bn over the past two years via fraudulent email. The Wall Street Journal reports that the CEO of an unnamed UK-based energy company thought he was talking on the phone with his boss, the CEO of the German parent company, who had asked him to urgently transfer €220,000 ($243,000) to a Hungarian supplier. In fact, the UK CEO was taking instructions from a scammer who had used AI-powered voice technology to impersonate the German CEO. It's the voice equivalent of deepfake videos, which are causing alarm for their potential to manipulate public opinion and sow social discord.
In the first known case of successful financial scamming via audio deep fakes, cybercrooks were able to create a near-perfect impersonation of a chief executive's voice – and then used the audio to fool his company into transferring $243,000 to their bank account. A deep fake is a plausible video or audio impersonation of someone, powered by artificial intelligence (AI). Security experts say that the incident, first reported by the Wall Street Journal, sets a dangerous precedent. "In the identity-verification industry, we're seeing more and more artificial intelligence-based identity fraud than ever before," David Thomas, CEO of identity verification company Evident, told Threatpost. "As a business, it's no longer enough to just trust that someone is who they say they are. Individuals and businesses are just now beginning to understand how important identity verification is. Especially in the new era of deep fakes, it's no longer just enough to trust a phone call or a video file."
Fraudsters are always looking for new ways to cheat people out of money. A report claims that a company CEO was tricked by scammers who faked the voice of the parent company's CEO to get the executive to transfer $243,000 to an external account. The story claims that in March, criminals used commercially available voice-generating AI software to impersonate the CEO of a German parent company that owns a UK-based energy firm. The thieves and their deepfaked corporate CEO tricked the real CEO of the British energy company into sending funds to what was claimed to be the account of a Hungarian supplier, with assurances that the transfer would be reimbursed immediately.