Scams have always existed. In the past, criminals knocked on doors pretending to be bank employees. Today, they use Artificial Intelligence to send audio messages in the voices of relatives asking for money. Technology has made fraud more realistic, but those who know the strategies scammers use have a better chance of escaping them.
Leonardo Oda, a marketing and technology specialist, warns that AI has changed the game. "Criminals exploit fear and urgency to make victims act without thinking. That's why the best way to protect yourself is to know how to recognize the signs and adopt extra checks before making any decision," he affirms.
Audio and video scams: be wary of your own perception
AI voice cloning has reached a concerning level. Criminals now need only a few seconds of audio to replicate someone's voice accurately. This means a simple WhatsApp message or a video on social media can be enough to create fake audio clips that deceive even close family members.
To protect yourself, Oda recommends confirming the person's identity before taking any action. If you receive a request for money by audio, call the contact back and verify the information. If the person does not answer or responds with vague phrases like "I can't talk right now", be doubly suspicious. Another effective measure is to agree on a secret keyword with family members to be used in emergencies.
It is also important to pay attention to details that may reveal fraud. Unusual pauses, a slightly distorted tone of voice, or a mechanical speech rhythm are signs of manipulation by AI.
With video, deepfake technology makes it even harder to tell what is real from what has been manipulated. Lip movements misaligned with the audio, artificial facial expressions, irregular flickering, and distortions in the background of the image may indicate forgery. The voice may also have unusual intonation or robotic pauses.
"The best defense is to be suspicious", even if it seems real. Scammers use AI to manipulate emotions and create a sense of urgency. If something seems strange, "check before acting", reinforces Oda
Banks and companies do not ask for data via message
"Messages supposedly sent by banks or companies", alerting about urgent blockages, are another common tactic, alert Ode. With the use of AI, scammers can create convincing emails and SMS, simulating official communications to induce victims to click on fraudulent links
To avoid this type of scam, the guidance is to never click on links received by email, SMS, or WhatsApp. Instead, check directly on the bank's official website or call the phone number printed on your card. It is also important to never share passwords or verification codes and to be wary of alarming messages, since criminals exploit fear to push victims into hasty decisions.
Fake promotions and non-existent sweepstakes
"Congratulations! You won a brand new car; you just need to pay a fee to release it." Who hasn't received a message like this? With AI, these scams have become even more sophisticated. Criminals create manipulated videos and audio clips using the faces and voices of celebrities to promote fake prizes, making the farce even more convincing.
"The first step to protect yourself is to remember that", if you haven't signed up for any promotion, there is no reason to gain anything, says Oda. Furthermore, excessively advantageous offers should be viewed with suspicion. Before believing in the promise, check the company's official website to see if the promotion really exists. AND, under no circumstances, provide bank details to release a prize
Information and caution are the best defense
Scammers take advantage of misinformation and urgency to deceive victims. Leonardo Oda recommends sharing these guidelines with family members, especially the elderly and teenagers, who are frequent targets.
"The best way to avoid a scam is to slow down". If something seems strange, stop, think and confirm before making any decision, concluded the expert