Context: Voice clone fraud has been on the rise in India. A report published in May last year revealed that 47% of surveyed Indians have either been a victim of an AI-generated voice scam or know someone who has.
Mechanism Of Voice Cloning
- Scammers can create a voice imitation by uploading a recording of a person's voice to software such as Murf, Resemble, or Speechify, which can replicate the voice with high accuracy, albeit with some limitations in tone.
- Voice cloning technology uses advanced deep learning methods, including recurrent neural networks (RNNs) and convolutional neural networks (CNNs), to learn the intricate patterns of a person's speech and synthesise a voice that sounds convincingly real (see the illustrative sketch below).
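
For readers curious about what such a pipeline looks like in practice, below is a minimal, heavily simplified PyTorch sketch. All class names, layer sizes, and data are hypothetical and the model is untrained; it only illustrates the architectural idea that a CNN can compress a short reference recording into a "speaker embedding" while an RNN generates spectrogram frames for new text conditioned on that embedding. Commercial systems like those named above are far larger and are trained on many hours of speech.

```python
# Illustrative sketch only -- NOT a working voice-cloning system.
# Shows how a CNN "speaker encoder" and an RNN "spectrogram decoder"
# can be combined; all shapes and names below are hypothetical.
import torch
import torch.nn as nn


class SpeakerEncoder(nn.Module):
    """CNN that compresses a reference mel-spectrogram into a speaker embedding."""

    def __init__(self, n_mels=80, embed_dim=128):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv1d(n_mels, 256, kernel_size=5, padding=2), nn.ReLU(),
            nn.Conv1d(256, 256, kernel_size=5, padding=2), nn.ReLU(),
        )
        self.proj = nn.Linear(256, embed_dim)

    def forward(self, mel):                  # mel: (batch, n_mels, frames)
        feats = self.conv(mel).mean(dim=2)   # average over time -> (batch, 256)
        return self.proj(feats)              # speaker embedding: (batch, embed_dim)


class SpectrogramDecoder(nn.Module):
    """RNN that turns text tokens plus a speaker embedding into mel frames."""

    def __init__(self, vocab_size=100, embed_dim=128, n_mels=80):
        super().__init__()
        self.text_embed = nn.Embedding(vocab_size, embed_dim)
        self.rnn = nn.GRU(embed_dim * 2, 256, batch_first=True)
        self.to_mel = nn.Linear(256, n_mels)

    def forward(self, tokens, speaker_emb):  # tokens: (batch, seq_len)
        text = self.text_embed(tokens)       # (batch, seq_len, embed_dim)
        # Repeat the speaker embedding along the sequence so every step is conditioned on it.
        spk = speaker_emb.unsqueeze(1).expand(-1, text.size(1), -1)
        out, _ = self.rnn(torch.cat([text, spk], dim=-1))
        return self.to_mel(out)              # predicted mel frames: (batch, seq_len, n_mels)


if __name__ == "__main__":
    # Dummy inputs: a short "reference clip" and a short text prompt (random data).
    reference_mel = torch.randn(1, 80, 300)
    text_tokens = torch.randint(0, 100, (1, 40))

    speaker_emb = SpeakerEncoder()(reference_mel)
    mel_frames = SpectrogramDecoder()(text_tokens, speaker_emb)
    print(mel_frames.shape)  # torch.Size([1, 40, 80]) -- untrained, so just noise
```

In a real system, the decoder's output frames would be passed through a vocoder to produce audio, and both networks would be trained on large speech datasets; the ease of running such pipelines is what makes convincing voice imitation accessible to scammers.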
Key Findings Of The Report
- McAfee’s report “The Artificial Imposter” revealed that 47% of Indians surveyed have encountered AI voice scams, almost twice the global average.
- Among the countries surveyed, India reported the highest number of AI voice scam victims.
- According to McAfee, two-thirds of Indian respondents would likely respond to urgent monetary requests from calls mimicking friends or family.
- Scam messages claiming a robbery, a car accident, a lost phone or wallet, or a need for travel-related financial help were notably effective.
- With 86% of Indians frequently sharing their voice data online, scammers have ample source material, which increases the effectiveness of these scams.