Forget email: Scammers use CEO voice 'deepfakes' to con workers into wiring cash - ZDNet


Criminals are using AI-generated audio to impersonate a CEO's voice and con subordinates into transferring funds to a scammer's account. So-called deepfake voice attacks could be the next frontier in a scam that, via fraudulent email, has cost US businesses almost $2bn over the past two years. The Wall Street Journal reports that the CEO of an unnamed UK-based energy company thought he was talking on the phone with his boss, the CEO of the German parent company, who had asked him to urgently transfer €220,000 ($243,000) to a Hungarian supplier. In fact, the UK CEO was taking instructions from a scammer who had used AI-powered voice technology to impersonate the German CEO. It is the voice equivalent of the deepfake videos that are causing alarm for their potential to manipulate public opinion and sow social discord.

Fake AI-generated voice of CEO used to defraud energy company - SiliconANGLE


An unnamed energy company has been defrauded of $243,000 by scammers who used artificial intelligence to mimic the voice of its chief executive officer. In an incident detailed by The Wall Street Journal, the scammers used the technology in a phone call with the CEO of a U.K. subsidiary of the unnamed German company to order a transfer of funds to a Hungarian supplier. The U.K. CEO believed that he was talking to his German chief and transferred the funds. Those behind the fraud attempted the trick again, but the U.K. subsidiary became suspicious as staff noticed that the calls were originating from Austria, not Germany. The funds were transferred from a Hungarian bank to an account in Mexico before being transferred elsewhere.

Ai Editorial: Detecting deepfakes to combat identity fraud - Ai


Ai Editorial: Deepfakes supported by AI techniques are today considered a growing problem. It is vital to build AI systems that can automate deepfake detection so that risks such as identity fraud can be tackled, writes Ai's Ritesh Gupta. Artificial intelligence (AI)-based identity fraud is emerging as a serious issue. Recognition of a person's voice and face as a way to validate their identity is under scrutiny with the rise of synthetic media and deepfakes. Be it security risks, user privacy concerns or fraudulent transactions, the repercussions are being probed at this juncture. Technology to manipulate images, videos and audio files is progressing faster than our ability to tell what is real from what has been faked.
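To make the idea of automated deepfake detection concrete, here is a minimal, illustrative sketch of the first stage such a system might use: extracting simple spectral features from an audio signal and applying a toy decision rule. This is an assumption-laden baseline, not any detector described in the articles above; the feature names and threshold are hypothetical, and a production system would replace the rule with a trained classifier.

```python
import numpy as np

def spectral_features(signal, sr=16000, frame=1024):
    """Compute mean spectral centroid and spectral flatness over frames.
    These are simple hand-crafted features sometimes used as a baseline
    in synthetic-speech detection pipelines."""
    feats = []
    for start in range(0, len(signal) - frame, frame):
        window = signal[start:start + frame] * np.hanning(frame)
        mag = np.abs(np.fft.rfft(window)) + 1e-12   # avoid log(0)
        freqs = np.fft.rfftfreq(frame, 1.0 / sr)
        # Centroid: magnitude-weighted mean frequency of the frame.
        centroid = np.sum(freqs * mag) / np.sum(mag)
        # Flatness: geometric / arithmetic mean of the spectrum
        # (near 1 for noise-like frames, near 0 for tonal frames).
        flatness = np.exp(np.mean(np.log(mag))) / np.mean(mag)
        feats.append((centroid, flatness))
    return np.mean(feats, axis=0)

def looks_synthetic(signal, sr=16000, flatness_threshold=0.5):
    """Toy rule (hypothetical): flag signals with very flat, noise-like
    spectra. A real detector would use a model trained on labeled
    genuine and synthetic speech, not a fixed threshold."""
    _, flatness = spectral_features(signal, sr)
    return flatness > flatness_threshold

# Demo on synthetic test signals (not real audio):
sr = 16000
t = np.arange(sr) / sr
tone = np.sin(2 * np.pi * 220 * t)       # tonal signal: low flatness
noise = np.random.default_rng(0).standard_normal(sr)  # high flatness
print(looks_synthetic(tone, sr))   # False
print(looks_synthetic(noise, sr))  # True
```

The design point is that detection reduces to a classification problem over acoustic features; modern systems swap the hand-crafted features and threshold here for learned embeddings and a neural classifier, but the pipeline shape is the same.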

The Emerging Threats of Deepfake Attacks and Countermeasures - Artificial Intelligence

Deepfake technology (DT) has reached a new level of sophistication. Cybercriminals can now manipulate sounds, images, and videos to defraud and misinform individuals and businesses. This represents a growing threat to international institutions and individuals that needs to be addressed. This paper provides an overview of deepfakes, their benefits to society, and how DT works. It highlights the threats that deepfakes present to businesses, politics, and judicial systems worldwide. Additionally, the paper explores potential solutions to deepfakes and concludes with future research directions.

Is AI-Enabled Voice Cloning the Next Big Security Scam?


A company that specializes in detecting voice fraud is sounding the alarm over an emerging threat. With the help of AI-powered software, cybercriminals are starting to clone people's voices to commit scams, according to Vijay Balasubramaniyan, CEO of Pindrop. "We've seen only a handful of cases, but the amount of money stolen can reach as high as $17 million," he told PCMag. During a presentation at RSA, Balasubramaniyan said Pindrop has also investigated about a dozen similar cases over the past year involving fraudsters who used AI-powered software to "deepfake" someone's voice and perpetrate their scams. "We're starting to see deepfake audios emerge as a way to target particular speakers, especially if you're the CEO of a company, and you have a lot of YouTube content out there," he said.