Alert over new scams: fake video calls with voice and images generated by artificial intelligence have arrived



Over the past year, advances in artificial intelligence have shown an unprecedented capacity for lifelike image compositing, fake video, and coherent text. Voice cloning now joins that list, and security experts anticipate a new wave of virtual scams.


Voices forged with artificial intelligence still have a long way to go before they are indistinguishable from human voices, but in a short video call they are capable of fooling even the most experienced.

Especially if the person appearing on the phone screen, asking for a deposit over some misfortune (an accident, a robbery, an emergency), matches the voice and appearance of whoever they claim to be.


The rise of this scheme has its epicenter in China and is rapidly spreading to other regions. Several Asian media outlets report what sounds like a science-fiction chimera: people swindled by a digitized avatar.

Calls from digital avatars asking victims for money are growing.

For this reason, the Cyberspace Administration of China is advising the public, through posts on Weibo (China's equivalent of Twitter), to “be cautious about providing biometric information and to refrain from sharing videos and other images on the Internet.”

However, the authorities lack the resources to stop this type of fraud: identifying the scammer behind a call is complicated, and the technology is so new that there is no case law allowing them to act quickly.

As reported by the Xinhua News Agency, a businessman named Wan Guo initiated a $3,000 transfer from his account after receiving a WeChat call from a woman whose face and voice closely resembled his wife’s.

The excuse was that she had been in a car accident and had to pay for the damage caused to another driver. To head off any suspicion, the caller insisted the payment be immediate.

“Although there was something artificial about her delivery, it did not occur to me to be suspicious, as the facial expressions and even the voice were those of my wife,” the businessman told the Global Times.

An expanding scourge

The danger posed by voice cloning through artificial intelligence.

Investigators in the case found that the scammers were familiar with the couple’s routines. They also knew that the woman had a cooking channel on a social network, from which they took captures of her face and voice to build the deepfake.

Sensing something amiss, the businessman messaged his wife a few moments later, and she denied any of it had happened. Guo immediately notified the bank, which blocked the operation and prevented the transfer.

Several similar cases appear in the pages of the Washington Post, which indicates that, according to Federal Trade Commission (FTC) data, this type of fraud ranks as the second most frequent, with more than 36,000 complaints.

The fear is that some scammers have begun using this technology to clone the faces of influencers and streamers in order to claim products and royalties from the companies that hire them.

In most attempts it is difficult to spot the trap, especially since the urgent tone makes the victim less reflective. The matter gets more complicated if the person receiving the call is older and unaware of these technologies.

How the deception is generated

Just a year ago, AI-generated images looked like flawed sketches; now they are realistic enough to fool millions. Something similar is happening with voice.

Through artificial intelligence, text-to-speech startups such as ElevenLabs and Respeecher can reproduce any phrase from an audio sample of just a few seconds, something very easy to obtain on social networks.
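To give a sense of how little is needed, below is a minimal sketch of such a request against ElevenLabs’ publicly documented text-to-speech HTTP API. The endpoint path, the `xi-api-key` header, and the JSON fields follow the published documentation at the time of writing but may change; the API key and voice ID are placeholders, and the voice ID is assumed to refer to a voice previously cloned from a short sample.

```python
import requests

# Placeholders: a real key and a voice ID previously created from a
# few seconds of sample audio (e.g., via the voice-cloning endpoint).
API_KEY = "YOUR_API_KEY"
VOICE_ID = "YOUR_VOICE_ID"

# Endpoint shape and fields follow ElevenLabs' documented API and may change.
resp = requests.post(
    f"https://api.elevenlabs.io/v1/text-to-speech/{VOICE_ID}",
    headers={"xi-api-key": API_KEY, "Content-Type": "application/json"},
    json={
        "text": "Any phrase at all, spoken in the cloned voice.",
        "model_id": "eleven_multilingual_v2",
    },
    timeout=30,
)
resp.raise_for_status()

# The API returns the synthesized audio directly as bytes (MP3 by default).
with open("cloned_phrase.mp3", "wb") as f:
    f.write(resp.content)
```

The point is not the specific vendor but how thin the barrier is: one HTTP call and a short sample are all that separate a public video clip from a working voice clone.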

Voice-generation software analyzes the patterns that make a person’s voice unique and searches a large database for a similar pitch. It then recreates the timbre and the individual sounds to produce a convincing imitation.
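To make “the patterns that make a voice unique” concrete, here is a short sketch using the open-source librosa library to pull two such patterns out of a few seconds of speech: the pitch contour (how high or low the voice sits) and an MFCC timbre profile (the spectral shape voice models imitate). The bundled LibriSpeech example clip stands in for any audio captured from social media.

```python
import librosa
import numpy as np

# Load ~5 seconds of speech; librosa's bundled example clip is
# downloaded on first use, and any local WAV/MP3 path works the same way.
y, sr = librosa.load(librosa.ex("libri1"), duration=5.0)

# Pitch contour: fundamental frequency per frame via probabilistic YIN.
f0, voiced_flag, voiced_prob = librosa.pyin(
    y,
    fmin=librosa.note_to_hz("C2"),  # ~65 Hz, low end of speech
    fmax=librosa.note_to_hz("C6"),  # ~1 kHz, high end of speech
    sr=sr,
)
print(f"median pitch: {np.nanmedian(f0):.1f} Hz")  # NaN frames are unvoiced

# Timbre: MFCCs summarize the spectral shape that makes a voice recognizable.
mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13)
voiceprint = mfcc.mean(axis=1)  # crude fixed-length "voice fingerprint"
print("13-dim timbre profile:", np.round(voiceprint, 1))
```

Cloning systems learn far richer representations than this, but features of exactly this kind are what they match against when recreating a target voice.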

Security experts argue that the artificial voice does not always sound as natural as the original: the tone is monotonous and robotic, with strange stammers and synthetic artifacts.

Source: Clarin

