Things You Should Know But Don’t: AI Voice Clones

Posted April 1, 2024

When AI-generated deepfakes first became a concern on the internet, the worry centered mostly on video. Some deepfakes were humorous or good-natured, such as editing actors into movies they weren't in, but it quickly became apparent to the general public that they could be used for more sinister purposes. Deepfakes can harm and defame celebrities, politicians, or just about anyone else who has pictures or video of themselves online. Nowadays it isn't just people's faces that can be replicated, but their voices as well.

It can be easy to forget that not everyone with access to these new AI tools will use them for harmless fun. Cyber criminals are already harnessing AI to pull off phishing scams and even to write malicious code. Voice deepfakes are no exception, and the threat grows as the technology improves and becomes more accessible.

Last year, the FTC issued an official consumer warning about phone scams that use AI-generated voices to mimic your loved ones. Scammers use these voice clones to make it sound as though a loved one is in an emergency (arrested, in a car accident, or in the hospital) and needs you to send money right away. The calls manufacture a sense of urgency and prey on both kindness and ignorance.

According to the FTC, “All [the scammer] needs is a short audio clip of your family member’s voice — which he could get from content posted online — and a voice-cloning program.” As social media platforms shift toward video as their main form of content, many people are left vulnerable. Younger users are especially prone to sharing content that shows both their face and voice; 44.4% of TikTok’s U.S. users are 24 years old or younger.

The best way to keep your voice from being used in a scam is to limit how much you post online that showcases it. Once someone has a sample of your voice, though, there isn't much you can do to stop them from cloning it. The FTC has announced that it's working on ways to prevent or detect AI-generated voices, but that work is still in its early research stages.

There are still ways to avoid being scammed, however. First, learn the red flags of these phone scams so you can recognize them. Second, if you receive a call like this, the FTC suggests you hang up and call your loved one back directly at a number you know is theirs. Third, even if you aren't worried about falling for such a scam yourself, tell your family about it, particularly anyone who may not be so tech-savvy.
