FTC warns of AI scam tricking people that loved ones are in danger


CHARLOTTE — There is an urgent warning about imposter scams that use Artificial Intelligence to mimic the voice of a loved one.

Bad actors pretend to be someone you trust and say they are in danger or need help, then ask for your money.

The Federal Trade Commission said imposter scams like these are now the most commonly reported type of fraud in the country.

Experts said scammers capitalize on lapses in judgment made in the heat of the moment.

“Think about the stress that you’re under when you think your loved one is calling you and telling you about being in an accident,” said Dan Woods, Global Head of Bot and Risk Management. “Suddenly that stress causes you to make judgments you wouldn’t ordinarily make.”

Recommendations to stay safe include creating a code word with loved ones so they can confirm their identity, and being skeptical of urgent requests for money.

If you aren’t sure, contact your family member another way.
