
Artificial Imposter: 47% of Indian phone users have experienced AI voice scams, highest in the world

About half of Indian adults, or 47%, have experienced an AI voice scam, nearly double the global average of 25%. Furthermore, about 69% of Indian adults say they are not sure they could tell a real voice from an AI-generated one.

Mehul Reuben Das May 02, 2023 10:28:39 IST

With the advancements that researchers and developers have made in making AI more accessible and powerful, it was only a matter of time before scammers and other bad actors started using the technology to defraud people. Even so, the extent to which scammers are using AI, especially in voice scams, is astounding.

What is even more astounding is the number of Indian phone users falling for AI voice scams. As per a recent McAfee report, about 47 per cent of Indian phone users have experienced an AI voice scam in recent years, the highest share of any country in the world. The global average, for comparison, is about 25 per cent.

Sound-based AI models, or rather AI voice generators, need very little by way of a prompt. With only three seconds of audio needed to clone a person’s voice, these models are propelling an increase in online voice scams.

Voice cloning has become very easy, thanks to AI
McAfee researchers spent three weeks studying the accessibility, ease of use, and effectiveness of AI voice-cloning tools as part of their analysis of this emerging trend, discovering more than a dozen such tools publicly available on the internet.

Also read: Scammers clone girl’s voice using AI in ‘kidnapping scam,’ demand $1 million as ransom

There are both free and commercial tools available, and many require only a basic degree of skill to use. In one case, three seconds of audio was enough to produce an 85 per cent voice match, and with additional time and effort, the accuracy can be increased further. By training the data models on a handful of video clips, McAfee researchers were able to achieve a 95 per cent voice match.
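
To see just how low the barrier has fallen, here is a minimal sketch of what a voice-cloning workflow can look like in Python. The McAfee report does not name the tools its researchers tested, so the open-source Coqui TTS library and its XTTS v2 model are used here as an assumed stand-in, and the file names are placeholders. A few seconds of reference audio is the only input the model needs:

# Minimal voice-cloning sketch. Assumptions: the McAfee report does not
# name its tools; the open-source Coqui TTS library ("pip install TTS")
# and its XTTS v2 model are stand-ins, and the file names are placeholders.
from TTS.api import TTS

# Download and load a multilingual voice-cloning model.
tts = TTS("tts_models/multilingual/multi-dataset/xtts_v2")

# A short recording of the target speaker is the only "training" input.
tts.tts_to_file(
    text="Hi, it's me. I've lost my wallet, can you send some money?",
    speaker_wav="reference_clip.wav",  # a few seconds of the target's voice
    language="en",
    file_path="cloned_message.wav",
)

That the entire workflow fits in a handful of lines is precisely what makes these tools so dangerous in the wrong hands.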

The more realistic the clone, the better a cybercriminal’s chances of duping someone into handing over their money, and with these hoaxes built on exploiting the emotional bonds of close relationships, a scammer can make thousands of dollars in a matter of hours.

“Advanced artificial intelligence tools are altering the playing field for cybercriminals,” said Steve Grobman, McAfee CTO. “They can now clone a person’s voice and trick a close contact into sending money with very little effort,” Grobman explained. “It’s critical to stay vigilant and take proactive measures to keep yourself and your loved ones safe. 

“If you receive a call from your spouse or a family member in need of money, verify the caller by using a codeword or asking a question only they would know. Identity and privacy services will also assist to minimise the digital trace of personal information that a criminal might use to create a persuasive story when producing a voice clone,” he added.

McAfee’s researchers noticed that they had no issue replicating accents from around the world, whether from the US, UK, India, or Australia, but that more distinctive voices were harder to copy. For example, cloning the voice of a person with an uncommon cadence, rhythm, or style of speaking takes more effort, and such people are less likely to be targeted as a result.

Also read: Music labels are worried because of AI, thanks to a surprising new song ft. Drake X The Weeknd

The study team’s overarching conclusion was that artificial intelligence has already altered the game for cybercriminals. The barrier to entry has never been lower, making it easier to perpetrate cybercrime.

Indians are being targeted disproportionately
Everyone’s voice is distinctive, like a biometric fingerprint, which is why hearing someone speak is such a widely accepted way of establishing trust. However, with 86 per cent of Indian adults sharing their voice data online or in recorded notes at least once a week (via social media, voice notes, and other means), cloning how someone sounds has become a powerful tool in a cybercriminal’s arsenal. That is one of the major reasons why Indians are being targeted disproportionately.

What makes things worse is the fact that about 69 per cent of Indian adults were unsure whether they could tell the difference between an AI-generated clone and a genuine person.

More than half of Indian respondents (66 per cent) said they would respond to a phone call or voice message claiming to be from a friend or loved one in need of money, especially if they believed the request came from a parent (46 per cent), partner or spouse (34 per cent), or child (12 per cent). Messages claiming that the sender had been robbed (70 per cent), been in a car accident (69 per cent), lost their phone or wallet (65 per cent), or needed help while travelling overseas (62 per cent) were the most likely to elicit a response. The cost of falling for an AI voice scam can be significant, however: 48 per cent of Indians who lost money said it cost them more than INR 50,000.

According to the poll, the growth of deepfakes and disinformation has made people more sceptical of what they see online, with 27 per cent of Indian adults saying they are now less trusting of social media than ever before, and 43 per cent concerned about the rise of misinformation or disinformation.

What can we do about this?
There are some steps we can take to make sure we do not fall for AI voice scams, or at least not easily.

First, set a verbal ‘codeword’ with your children, family members, or trusted close acquaintances that only they would know. Make it a rule to always ask for it whenever they phone, text, or email for help, especially if they are elderly or vulnerable.

Also, make it a habit to always check the source. Pause and consider whether the call, text, or email is from an unknown sender, or even if it comes from a number you recognise, does it really sound like them? Hang up and call the person back directly, or try to verify the facts before replying or sending money.

Think before you click and share. Ask yourself, who are the people in your social media network? Do you truly know and trust them? Consider your online acquaintances and connections carefully: the more contacts you have and the more information you share, the greater the risk of your identity being cloned for malicious purposes.

Also, consider that identity-theft protection services can help ensure your personally identifiable information is not accessible, or notify you if it turns up on the Dark Web. Take control of your personal data to keep a cybercriminal from impersonating you.


Updated Date: May 02, 2023 11:59:30 IST
