AI Voice Cloning Attack  

There’s no doubt that generative artificial intelligence (AI) has empowered scammers to create sophisticated AI voice cloning scams. In 2023, the FTC reported that consumers lost $2.7 billion to imposter scams, the most reported fraud category of the year.

It takes just three seconds of audio to clone a person’s voice, giving scammers an easy avenue to launch a broad range of scams and disinformation attacks. Seventy-three percent of Americans are concerned about AI-generated deepfake robocalls that clone the voice of a loved one to try to scam them out of money.

To defend the voices of Americans, the FCC earlier this year declared AI-generated robocalls made without the consent of the called party illegal. The move is designed to combat malicious voice cloning by allowing fines against the companies behind these calls and the blocking of AI voice cloning scam calls. While this is a tangible step, Americans must remain vigilant because scammers don’t stop – they adapt. In this blog, we discuss five ways consumers can defend their voices.

How to Defend Your Voice from AI Voice Cloning Scams 

Switch to Automated Voicemail Messages 

If you are like most consumers, you may have set up a customized voicemail greeting that callers hear when they try to reach you. However, these recordings are long enough for scammers to capture your voice and feed it into AI voice cloning platforms.

Luckily, there is a simple fix: change your voicemail greeting from your own voice to the automated message provided by your wireless service provider. On iPhone, tap the Voicemail tab in the Phone app, select Greeting in the top-left corner, then choose Default. For Android users, open the Phone app, tap the voicemail icon or dial your voicemail number, follow the prompts to access your voicemail settings and select the option to reset or revert to the default greeting.

Create a Family Safe Word 

One of the most common imposter scams targeting American consumers is the ‘imposter family member scam’. Often aimed at older family members, it uses AI-cloned voices to mimic a loved one and make it sound as if they are in peril and need immediate financial assistance.

To avoid falling victim to these scams, consider establishing a safe word that only your family knows and can use in emergencies or when a cloned voice is suspected.

Limit Social Media Recordings  

People should always be mindful of what they post on social media channels, but with the rise of AI voice cloning scams, they should be especially careful. Posting phrases like “help me” makes it extremely easy for scammers to capture your voice and make the cloned version sound as if you are in danger.

If you actively post video content on social media channels, including a designated safe word throughout is a subtle way for friends and family to confirm the content is really you; its absence can signal that a bad actor has hacked your account and posted AI-generated voice content featuring you.

Avoid Voice Biometric Verification  

Accounts that use voice biometric verification are becoming an increasingly common target. Any time you record new speech samples to log into an account, those voice samples are often saved to your phone, making them easy for scammers to capture and manipulate into AI-cloned voices.

Facial recognition, while not without its own limitations, provides an alternative mechanism for protecting sensitive information.

Do Not Speak First to Unknown Numbers   

If you answer a call from an unknown number, wait for the person or voice on the other end of the line to speak first. As mentioned earlier, scammers only need a few seconds to record your voice, so the less you say, the better.  

As AI voice cloning scams become increasingly sophisticated, defending your voice from scammers is more crucial than ever. By implementing these five measures, you may reduce the risk of falling victim to AI-generated scams.   

How to Report AI-Generated Scam Calls 

It is best practice never to engage with unknown numbers and to report phone numbers used by scammers to your carrier. If you believe you are the victim of a scam, you can report it to your local police, your state attorney general’s office and the FTC.

Greg Bohl is Chief Data Officer at TNS with specific responsibility for TNS’ Communications Market solutions. 

TNS Enterprise Authentication

Defend the Voice of Your Organization

Enterprise Authentication and Spoof Protection allow only verified calls to reach your customers and immediately block non-authenticated calls from using your number.
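As an illustration only – not a description of TNS’ actual implementation – the sketch below shows how a carrier-side filter might use STIR/SHAKEN call-authentication results (the attestation level and the SIP “verstat” parameter) to decide whether to deliver, flag or block an inbound call. All names, values and thresholds here are hypothetical and simplified.

```python
# Illustrative sketch only -- not TNS' product. It assumes inbound calls
# arrive with STIR/SHAKEN results: an attestation level ("A", "B", "C")
# and a verstat value from SIP signaling. All names are hypothetical.

from dataclasses import dataclass


@dataclass
class InboundCall:
    caller_id: str
    attestation: str | None  # "A" = full, "B" = partial, "C" = gateway
    verstat: str | None      # e.g. "TN-Validation-Passed"


def screen_call(call: InboundCall) -> str:
    """Return 'deliver', 'flag' or 'block' for an inbound call."""
    # The signature failed verification: the caller ID is likely spoofed.
    if call.verstat == "TN-Validation-Failed":
        return "block"
    # Fully attested and validated: the originating carrier vouches that
    # the caller is authorized to use this number.
    if call.attestation == "A" and call.verstat == "TN-Validation-Passed":
        return "deliver"
    # Unsigned or only partially attested calls get flagged for extra
    # screening rather than reaching the customer directly.
    return "flag"


if __name__ == "__main__":
    print(screen_call(InboundCall("+15551234567", "A", "TN-Validation-Passed")))  # deliver
    print(screen_call(InboundCall("+15559876543", None, None)))                   # flag
    print(screen_call(InboundCall("+15550001111", "C", "TN-Validation-Failed")))  # block
```

The key design idea is that blocking decisions rest on cryptographic verification of the calling number rather than on the number’s appearance alone, which is what makes spoofed caller IDs detectable in the first place.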