The Cutting Edge of AI Cyber Attacks: Deepfake Audio Used to Impersonate Senior Executives

#artificialintelligence

There is a great deal of public concern about deepfakes, most of it centered on the ramifications of being able to quickly and easily face-swap videos. That concern is certainly well-founded, but it may be obscuring an even more immediate threat – deepfake audio. Voice-swapping has already been put to use in at least a handful of artificial intelligence (AI) cyber attacks on businesses, enabling attackers to gain access to corporate networks and convince employees to authorize a money transfer. The primary use of deepfake audio is to enhance a very common type of attack – business email compromise (BEC). A business email compromise attack usually begins with some sort of phishing to gain access to the company network and reconnoiter the payment systems.


Forget email: Scammers use CEO voice 'deepfakes' to con workers into wiring cash - ZDNet

#artificialintelligence

Criminals are using AI-generated audio to impersonate a CEO's voice and con subordinates into transferring funds to a scammer's account. So-called deepfake voice attacks could be the next frontier in a scam that, via fraudulent email, has already cost US businesses almost $2bn over the past two years. The Wall Street Journal reports that the CEO of an unnamed UK-based energy company thought he was talking on the phone with his boss, the CEO of the German parent company, who had asked him to urgently transfer €220,000 ($243,000) to a Hungarian supplier. In fact, the UK CEO was taking instructions from a scammer who had used AI-powered voice technology to impersonate the German CEO. It's the voice equivalent of the deepfake videos that are causing alarm for their potential to manipulate public opinion and sow social discord.


Israel sees cyber attacks using voice impersonation of senior staff - Express Computer

#artificialintelligence

The Israel National Cyber Directorate (INCD) has issued a warning about a new type of cyber attack that uses artificial intelligence (AI) technology to impersonate senior company executives. In this method, the company's staff members are instructed to perform transactions such as money transfers, as well as malicious activity on the company's network. Recently, reports of cyber attacks of this kind were received at the operations centre of the INCD, Xinhua news agency reported. The new offensive is of the business email compromise (BEC) type: email fraud against commercial and government organizations that uses social engineering to motivate employees to act for the attacker's benefit. The most common variants are phishing messages and invoicing fraud, in which the attacker impersonates a vendor, submits an invoice to the company and tries to pressure an employee into making a bank transfer, providing information or allowing access to the company's network.


Fraudsters deepfake CEO's voice to trick manager into transferring $243,000

#artificialintelligence

It's already getting tough to discern real text from fake, genuine video from deepfake. Now, it appears that the use of fake voice tech is on the rise too. That's according to the Wall Street Journal, which reported the first known case of AI-based voice fraud -- aka vishing (short for "voice phishing") -- that cost a company $243,000. In a sign that audio deepfakes are becoming eerily accurate, criminals in March used commercially available voice-generating AI software to impersonate the boss of a German parent company that owns a UK-based energy firm. They then tricked the latter's chief executive into urgently wiring the funds to a Hungarian supplier within an hour, with assurances that the transfer would be reimbursed immediately.

