Fake kidnappers simulate teen's voice through AI, use it to demand ransom from her mother
Screenshot of Good Morning America YouTube video

An Arizona mother panicked after she received a phone call from a man claiming to have kidnapped her teenage daughter, but the phone call turned out to be an AI-generated hoax.

Jennifer DeStefano of Scottsdale, Arizona, claimed she was at a dance studio with one daughter when she received a phone call about her other daughter, 15-year-old Brie. The call came from an unknown number, so DeStefano almost let it go to voicemail, but she answered at the last second because Brie was out of town on a ski trip, and DeStefano worried that something might have happened.

When she answered the call, she heard Brie's voice repeating her name. Brie was sobbing and sounded terrified. "Mom, I messed up," Brie's voice said through tears.

Then, an unknown man's voice came on the phone. He told DeStefano that he had taken Brie hostage, and he threatened to drug Brie, rape her, and drop her off in Mexico if DeStefano dared to contact the authorities. He then demanded DeStefano pay $1 million in ransom to get her daughter back.

Other moms at the dance studio rushed to DeStefano's assistance, calling police and DeStefano's husband. Thankfully, they were quickly able to determine that Brie was safe on her ski trip, oblivious to the kidnapping hoax just perpetrated on her mother.

Though DeStefano was relieved that the call was a scam, she was amazed at how closely the scammers had recreated her daughter's voice. "It was completely her voice," DeStefano said. "It was her inflection. It was the way she would have cried.

"I never doubted for one second it was her," she added. "That’s the freaky part that really got me to my core."

Though AI-generated voices are nothing new, Subbarao Kambhampati, a computer science professor at Arizona State University, claims that advancements in AI technology, much of which is freely available online, allow scammers to recreate someone's voice using just a short sample.

"Now there are ways in which you can do this with just three seconds of your voice," Prof. Kambhampati said. "Three seconds. And with the three seconds, it can come close to how exactly you sound."

"You can no longer trust your ears," he claimed.

With videos abounding on social media and television, many people have unwittingly put themselves at risk of voice cloning. Law enforcement has shared several tips to help avoid falling victim to a voice-cloning scam:

  • Ask the alleged abductee questions that only he or she could answer.
  • Be wary of unfamiliar numbers, especially international calls or those from an unknown area code.
  • Establish a "safe word" with loved ones that they will share only if they are ever in danger.
"It all just seemed so real," recalled DeStefano, who has now made it her mission to alert others about the latest hoax. "The only way to stop this is with public awareness."
