CHARLOTTE — If your best friend or relative called and told you they were in trouble or in need of money, would you stop to think if that could really be their voice?
Maybe you did before, but Channel 9's Erika Jackson learned why you can't trust everything you hear, thanks to artificial intelligence.
She spoke with a local financial crimes detective and a UNC Charlotte cybersecurity professor to learn how scammers use AI to mimic your loved ones’ voices and scam you out of money.
“That voice is harvested from social media, videos, posts, things like that,” said Alec Campbell, CMPD financial crimes detective. “And then using the AI software, they manipulate that voice recording.”
Detective Campbell says he’s investigated multiple local scams incorporating AI.
“All kinds of people fall for these games, not just the elderly, but they seem to be targeted specifically,” Detective Campbell explained.
UNC Charlotte cybersecurity professor Dr. Bill Chu showed Jackson how easy it is to create a fake voice.
Dr. Chu said scammers can pull your voice from a public social media post and use caller ID spoofing to make it appear that the call is coming from your number.
Through free or paid websites, scammers can upload voice samples, type in a script and use templates to mimic the tone and inflection of a real person.
“You can’t believe anything anymore, you can’t believe in video anymore, you can’t believe audio,” explained Dr. Chu.
Jackson uploaded her own voice to one of these websites and tested a friend to see if she believed the clone.
“When I sent you that voice memo about Christmas Eve, did anything sound weird to you?” Jackson asked her friend on a call, after sending a voice memo generated with AI.
“You sounded muffled,” her friend responded.
“That was actually an AI clone of my voice.”
“Ew! Erika!”
“You thought it was me?”
“Yeah, was it not?”
“That wasn’t me.”
“Is it you right now? Am I talking to AI?”
Jackson asked the companies behind AI voice generator websites about the intended use of their products and whether they were aware scammers could use them.
A spokesperson for Resemble AI told Channel 9 that its policies and guidelines require customers to have consent to use the voice that’s being cloned.
It also prohibits users from making voice clones that are deceptive or that promote illegal activity.
Jackson shares these tips for avoiding AI voice scams, whether you’re the one whose voice could be cloned or the one answering the call:
- Share carefully on social media
- Be cautious of unsolicited calls from someone claiming to be a loved one
- Never send money or buy gift cards in response to a phone call
- Don’t act without thinking
The Federal Trade Commission announced earlier in November that it’s accepting ideas from companies on how to prevent and detect the malicious use of AI voice cloning.