
Deepfake voice fraud is becoming increasingly sophisticated. What can you do?

Deepfake voice technology can imitate a person's voice so closely that many victims are tricked into trusting what sounds like a familiar voice.

Báo Tuổi Trẻ, 09/07/2025

Photo 1: A scam call impersonating a familiar voice using deepfake technology

In an era of rapidly advancing artificial intelligence, the voice, once considered reliable proof of identity, has become a dangerous weapon in the hands of criminals. Deepfake voice technology can produce fake voices indistinguishable from the real thing, enabling sophisticated scam calls designed to defraud victims and appropriate their property.

Why are deepfake voices so frightening?

Deepfake voice is a technology that uses artificial intelligence (AI) and machine learning to synthesize a voice nearly identical to a real person's.

With modern models such as Tacotron and WaveNet, or voice-cloning platforms like ElevenLabs and Respeecher, fraudsters need only 3 to 10 seconds of sample audio to create a deepfake that is up to 95% convincing.
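To make the mechanics concrete, the following Python sketch shows the typical shape of a few-shot voice-cloning pipeline. Everything here is illustrative: the function names and stub logic are assumptions standing in for real components (a speaker encoder, a Tacotron-style acoustic model, a WaveNet-style vocoder), not any vendor's actual API.

    # Conceptual sketch only. The stubs stand in for real neural models;
    # every name here is hypothetical, not a real library or vendor API.
    from dataclasses import dataclass

    @dataclass
    class SpeakerEmbedding:
        """A compact 'voiceprint' estimated from a short audio sample."""
        vector: list

    def extract_speaker_embedding(sample_seconds: float) -> SpeakerEmbedding:
        # Few-shot cloning systems need only a few seconds of clean audio
        # (the article cites 3-10 s) to estimate this vector.
        if sample_seconds < 3.0:
            raise ValueError("too little audio even for few-shot cloning")
        return SpeakerEmbedding(vector=[0.0] * 256)  # placeholder values

    def synthesize(text: str, voice: SpeakerEmbedding) -> bytes:
        # An acoustic model conditioned on the speaker embedding predicts a
        # spectrogram; a neural vocoder then renders it as a waveform.
        assert len(voice.vector) == 256  # stub "conditioning" on the voiceprint
        return f"<waveform in cloned voice saying: {text!r}>".encode()

    voice = extract_speaker_embedding(sample_seconds=5.0)
    audio = synthesize("Mom, I had an accident, please send money now.", voice)
    print(audio.decode())

The asymmetry this sketch illustrates is the core of the threat: the attacker's input is a few seconds of public audio, while the output can be arbitrary speech in the victim's voice.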

Deepfake voices are especially dangerous because they can mimic a voice almost perfectly: pronunciation, intonation, even a person's unique speech habits.

This makes it very difficult for victims to tell real from fake, especially when the voice appears to belong to a relative, friend, or superior.

Harvesting voice samples is also easy, since most people now expose their voices through platforms such as TikTok, social media livestreams, podcasts, and online meetings. More worryingly, a deepfake voice leaves no visual traces the way a fake image or video does, which complicates investigations and leaves victims more likely to lose their money.

Photo 2: Just a few seconds of sampled voice is enough to create a deepfake

Deepfake voice scams are becoming increasingly sophisticated, often following a familiar script: imitating an acquaintance's voice in a fabricated emergency to create panic and pressure the victim into transferring money immediately.

In Vietnam, a mother received a call from her "son" saying he had been in an accident and urgently needed money. In the UK, a company director was scammed out of more than USD 240,000 after hearing his "boss" order a money transfer over the phone. An administrative employee was likewise deceived by a call from a "senior boss" requesting a payment to a "strategic partner"...

What these cases share is a fake voice reproduced so faithfully that the victim trusts it completely and acts before taking the time to verify.

Always verify, don't trust immediately

With the rise of deepfake voice scams, people are advised never to transfer money on the strength of a voice alone, even one that sounds exactly like a loved one. Instead, call back on a number you already know, or verify the request through other channels, before making any transaction.

Many experts also recommend that families and businesses agree on an "internal password" to verify identity in unusual situations, as sketched below.
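As a minimal sketch of how such a check could be rehearsed, assuming the passphrase was agreed out-of-band beforehand (the passphrase and function names below are made-up examples, not from the article):

    # Minimal sketch: verify a caller against a passphrase agreed in advance.
    # The passphrase is an illustrative value; share yours out-of-band,
    # never during the suspicious call itself.
    import hashlib
    import hmac

    AGREED_PASSPHRASE = "purple-lantern-42"  # hypothetical example

    def passphrase_matches(spoken: str) -> bool:
        # Hash both sides and compare in constant time so the check
        # itself leaks nothing about the secret.
        expected = hashlib.sha256(AGREED_PASSPHRASE.encode()).digest()
        given = hashlib.sha256(spoken.strip().lower().encode()).digest()
        return hmac.compare_digest(expected, given)

    print(passphrase_matches("Purple-Lantern-42"))  # True
    print(passphrase_matches("wrong guess"))        # False

In real life the check is performed by a person rather than a program; the point is that a deepfake can clone how someone sounds, but it cannot know a secret that was never shared online.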

People should also limit posting videos with clear recordings of their voice on social networks, especially long clips. Above all, vulnerable groups such as the elderly and people with little exposure to technology should be proactively warned and guided, since they are the prime targets of high-tech scams.

Photo 3: The voices of relatives, friends, and colleagues can all be faked

In many countries, authorities have begun tightening control of deepfake technology through dedicated legal frameworks.

In the US, several states have banned the use of deepfakes in election campaigns or to spread misinformation. The European Union has passed the AI Act, which requires organizations to be transparent and to clearly label content generated by artificial intelligence.

In Vietnam, meanwhile, there are as yet no regulations specific to deepfake voices, but related acts can be prosecuted under existing law as fraud, property appropriation, privacy infringement, or identity forgery.

In reality, however, the technology is evolving far faster than the law can keep up with, leaving loopholes for bad actors to exploit.

When voice is no longer evidence

The voice was once something intimate and trustworthy, but with deepfakes it is no longer valid proof. In the age of AI, everyone needs basic digital self-defense: verify proactively and stay vigilant, because any call could be a trap.

PHAN HAI DANG

Source: https://tuoitre.vn/lua-dao-bang-deepfake-voice-ngay-cang-tinh-vi-phai-lam-sao-20250709105303634.htm

