Khabar Har Din
© 2022 Foxiz News Network. Ruby Design Company. All Rights Reserved.
Tech | Technology | Trending

AI Voice Cloning: What It Is And How To Avoid Getting Scammed By It

News Desk
Last updated: 2024/02/13 at 9:50 AM

Artificial Intelligence (AI)-generated voice cloning has emerged as a potent tool for cybercriminals, with cases of extortion and fraud rising across India. According to the National Crime Records Bureau (NCRB), Delhi, the national capital, saw a staggering rise in cybercrime, with 685 cases reported in 2022, up from 345 in 2021 and 166 in 2020.

In a recent incident, a senior citizen from Delhi’s Yamuna Vihar fell victim to scammers who used AI-generated voice cloning to extort money. Lakshmi Chand Chawla received a ransom demand on WhatsApp featuring a child’s voice cloned with AI. Panicked by how realistic the voice sounded, Mr. Chawla complied with the scammers’ demands and transferred ₹50,000 via Paytm.

Voice cloning technology needs only a few seconds of audio to recreate someone’s voice with startling accuracy. According to security software company McAfee, even individuals with basic expertise can produce a clone with an 85 per cent voice match to the original, and McAfee researchers have achieved a 95 per cent match from just a small number of audio files.
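To make the “voice match” figures above concrete: speaker-verification systems typically convert each voice sample into a numeric embedding and compare embeddings with cosine similarity, reported as a percentage. The sketch below is purely illustrative; the embedding values are invented, and real systems use learned, high-dimensional embeddings rather than three hand-picked numbers.

```python
import math

def voice_match(embedding_a, embedding_b):
    """Return a 0-100 'voice match' score: cosine similarity between
    two speaker embeddings, scaled to a percentage (illustrative only)."""
    dot = sum(a * b for a, b in zip(embedding_a, embedding_b))
    norm_a = math.sqrt(sum(a * a for a in embedding_a))
    norm_b = math.sqrt(sum(b * b for b in embedding_b))
    return 100 * dot / (norm_a * norm_b)

# Hypothetical embeddings for an original voice and a cloned voice.
original = [0.90, 0.10, 0.40]
clone = [0.85, 0.15, 0.42]
score = voice_match(original, clone)
```

A clone that scores above a system’s decision threshold (often well below 95 per cent) would be accepted as the same speaker, which is why even an imperfect clone can be convincing enough for fraud.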

Fraudsters exploit this technology to perpetrate scams, such as the family emergency scam witnessed in Mr. Chawla’s case. By creating replicas of distressed family members’ voices, they manipulate victims into complying with their demands.


To avoid falling victim to AI voice cloning scams, individuals can take several precautions:

  1. Enable Caller ID: Keep the caller identification feature on your smartphone active so you can see who is calling before you answer. This helps distinguish legitimate calls from potential scams.
  2. Avoid Sharing Sensitive Information: Do not share personal details, including phone numbers and email IDs, with unknown individuals or suspicious contacts.
  3. Implement Call Blocking: Use your smartphone’s call-blocking feature to stop unwanted calls and potential scam attempts.

By adopting these preventive measures, individuals can mitigate the risk of falling prey to AI voice cloning scams and protect themselves from financial loss and emotional distress. Stay vigilant and cautious, especially when dealing with unexpected or urgent requests for money or personal information.

© 2022-24 Khabar har din. All Rights Reserved.