Artificial Intelligence (AI)-generated voice cloning has emerged as a potent tool for cybercriminals, with cases of extortion and fraud rising across India. According to the National Crime Records Bureau (NCRB), Delhi, the national capital, saw a sharp rise in cybercrime, with 685 cases reported in 2022, up from 345 in 2021 and 166 in 2020.
In a recent incident, Lakshmi Chand Chawla, a senior citizen from Delhi’s Yamuna Vihar, fell victim to scammers who used AI-generated voice cloning to extort money. He received a ransom demand via WhatsApp featuring a child’s voice cloned with AI. Panicked by how realistic the voice sounded, Mr. Chawla complied with the scammers’ demands and transferred ₹50,000 via Paytm.
Voice cloning technology requires just a few seconds of audio input to recreate someone’s voice with startling accuracy. According to security software company McAfee, even individuals with basic expertise can produce a clone with an 85 per cent voice match to the original. Accuracy improves with more data: McAfee researchers achieved a 95 per cent voice match using only a small number of audio files.
Fraudsters exploit this technology to perpetrate scams, such as the family emergency scam witnessed in Mr. Chawla’s case. By creating replicas of distressed family members’ voices, they manipulate victims into complying with their demands.
To avoid falling victim to AI voice cloning scams, individuals can take several precautions:
- Enable Caller ID: Always activate the caller ID feature on your smartphone to identify callers and their locations. This feature helps distinguish legitimate calls from potential scams.
- Avoid Sharing Sensitive Information: Refrain from sharing sensitive information, including phone numbers and email IDs, especially with unknown individuals or suspicious contacts.
- Implement Call Blocking: Use the call-blocking feature on your smartphone to prevent unwanted calls and potential scam attempts.
By adopting these preventive measures, individuals can reduce the risk of falling prey to AI voice cloning scams and protect themselves from financial loss and emotional distress. Stay vigilant and cautious, especially when dealing with unexpected or urgent requests for money or personal information.