Generative AI and improved machine learning algorithms have opened the floodgates to a new wave of scams. You’ve probably heard of deepfakes, where scammers use AI to create convincing fake videos of popular celebrities and, in some cases, even of a person’s relatives, to trick victims into sending money. Now, users are facing another threat: AI voice cloning.
In this scenario, fraudsters impersonate your family members using sophisticated voice cloning methods. Just a few weeks ago, NDTV reported an incident in which an elderly man in Delhi was deceived into thinking that his cousin’s son had been kidnapped. The scammer applied pressure by playing a cloned voice recording of the child, successfully convincing the elderly man to transfer Rs 50,000 via Paytm in a moment of panic.
It was later revealed that the kidnapping was a hoax planned to extort money from him. Scams of this nature are steadily surfacing as fraudsters adopt new technologies and techniques to become more convincing.
AI Voice Cloning Scam: Here’s How To Be Safe
Firstly, always verify any claim before acting on it, whether it’s a supposed emergency or an attempt at blackmail. Scammers rely on fear to make their victims panic so they can extort money from them. So, it’s important to stay calm when you receive a call like this.
Secondly, AI voice cloning is a relatively new technology, especially in the hands of scammers using consumer-grade applications. As of now, cloned voices often retain a robotic, digital quality. Pay attention to how sentences end, and listen for a robotic undertone in the voice.
You may have come across popular Instagram meme reels featuring a cloned version of PM Narendra Modi’s voice; the robotic feel is easy to notice, and the same applies to scam calls.