Fraud alert! 47% of Indians have experienced or know a victim of AI voice scams, says McAfee survey

A survey conducted by McAfee Corp reveals that more than two-thirds (69%) of Indians think they don’t know or cannot tell the difference between an AI voice and a real voice. About half (47%) of Indian adults have experienced, or know someone who has experienced, some kind of AI voice scam, almost double the global average (25%). Among Indian victims, 83% said they lost money, with 48% losing over INR 50,000.

The report, The Artificial Imposter, examines how artificial intelligence (AI) technology is fueling a rise in online voice scams, with just three seconds of audio required to clone a person’s voice. The survey polled 7,054 people across seven countries, including India.

“Artificial Intelligence brings incredible opportunities, but with any technology there is always the potential for it to be used maliciously in the wrong hands. This is what we’re seeing today with the access and ease of use of AI tools helping cybercriminals to scale their efforts in increasingly convincing ways,” said Steve Grobman, McAfee CTO.

Your voice is like a fingerprint
Everybody’s voice is unique, the spoken equivalent of a biometric fingerprint, which is why hearing somebody speak is such a widely accepted way of establishing trust. But with 86% of Indian adults sharing their voice data online or in recorded notes at least once a week (via social media, voice notes, and more), cloning how somebody sounds is now a powerful tool in the arsenal of a cybercriminal.

With the rise in popularity and adoption of artificial intelligence tools, it is easier than ever to manipulate images, videos, and, perhaps most disturbingly, the voices of friends and family members. The research reveals scammers are using AI technology to clone voices and then send a fake voicemail or voice note, or even call the victim’s contacts directly, pretending to be in distress. And with 69% of Indian adults not confident that they could tell a cloned voice from the real thing, it’s no surprise that this technique is gaining momentum.

Two-thirds (66%) of Indian respondents said they would reply to a voicemail or voice note purporting to be from a friend or loved one in need of money, particularly if they thought the request had come from their parent (46%), partner or spouse (34%), or child (12%). Messages most likely to elicit a response were those claiming the sender had been robbed (70%), was involved in a car incident (69%), had lost their phone or wallet (65%), or needed help while travelling abroad (62%). But the cost of falling for an AI voice scam can be significant, with 48% of Indians who’d lost money saying it had cost them over INR 50,000.

The survey also found that the rise of deepfakes and disinformation has made people more wary of what they see online, with 27% of Indian adults saying they’re now less trusting of social media than ever before and 43% concerned over the rise of misinformation or disinformation.

How to protect yourself from AI voice cloning

Set a verbal ‘codeword’ – Agree on a codeword with kids, family members or trusted close friends that only they could know. Make a plan to always ask for it if they call, text or email to ask for help, particularly if they’re older or more vulnerable.

Always question the source – If it’s a call, text or email from an unknown sender, or even if it’s from a number you recognize, stop, pause and think. Does that really sound like them? Hang up and call the person directly or try to verify the information before responding and certainly before sending money.

Think before you click and share – Who is in your social media network? Do you really know and trust them? Be thoughtful about the friends and connections you have online. The wider your connections and the more you share, the greater the risk that your identity may be cloned for malicious purposes.
