In the past, criminals knocked on the door pretending to be bank employees. Today, they use artificial intelligence to send audio messages in the voice of a relative asking for money. The technology has supercharged fraud, making scams more realistic, but those who know the strategies scammers use stand a better chance of escaping them.
Leonardo Oda, a marketing and technology expert, warns that AI has changed the game. “Criminals exploit fear and urgency to make victims act without thinking. The best way to protect yourself is to recognize the signs and adopt extra checks before making any decision,” he says.
Audio and video scams: be wary of your own perception
AI voice cloning has reached a worrying level. Criminals now need only a few seconds of audio to replicate someone's voice accurately. This means a simple WhatsApp voice message or a video on social media can be enough to create fake audio that deceives even close family members.
To protect yourself, Oda recommends confirming the person's identity before taking any action. If you receive a request for money by audio message, call the contact back and verify the information. If the person does not answer, or responds with vague phrases such as “I can't talk right now,” be extra suspicious. Another effective measure is to agree on a secret code word with family members to be used in emergencies.
It is also important to pay attention to details that can give the fraud away. Unusual pauses, a slightly distorted tone of voice, or a mechanical speech rhythm are signs of AI manipulation.
With video, deepfake technology makes it even harder to tell what is real from what has been manipulated. Lip movements out of sync with the audio, artificial facial expressions, irregular blinking, and distortions in the background of the image can all indicate a fake.
“The best defense is to be suspicious, even when something seems real. Scammers use AI to manipulate emotions and create a sense of urgency. If something seems strange, check before acting,” Oda stresses.
Banks and businesses do not ask for data by message
“Messages supposedly sent by banks or companies, warning of urgent account blocks, are another common tactic,” warns Oda. With AI, scammers can create convincing emails and text messages that mimic official communications to lure victims into clicking fraudulent links.
To avoid this type of scam, the advice is never to click on links received by email, SMS, or WhatsApp. Instead, go directly to the bank's official website or call the phone number printed on your card. It is also important never to share passwords or verification codes, and to be wary of alarmist messages, since criminals exploit fear to push victims into hasty decisions.
Fake promotions and non-existent sweepstakes
“Congratulations! You've won a brand-new car; you just have to pay a fee to claim it.” Who hasn't received a message like this? With AI, these scams have become even more sophisticated. Criminals create manipulated videos and audio clips, using celebrities' faces and voices to promote fake prizes, making the scam even more convincing.
“The first step to protecting yourself is to remember that if you never entered a promotion, there is no reason for you to win anything,” says Oda. In addition, offers that seem too good to be true should be viewed with suspicion. Before believing the promise, check the company's official website to confirm the promotion actually exists. And under no circumstances provide banking details to claim a prize.
Information and caution are the best defense
Scammers take advantage of misinformation and urgency to deceive victims. Leonardo Oda recommends sharing these guidelines with family members, especially the elderly and teenagers, who are frequent targets.
“The best way to avoid a scam is to slow down. If something seems strange, stop, think, and confirm before making any decision,” the expert concludes.