There is a new scam using artificial intelligence to clone voices. Arizona mother Jennifer DeStefano shared her story of a kidnapping hoax that sounded all too real.
Jennifer said when she recently received a call from an unknown number, she almost didn't answer. But her 15-year-old daughter, Briana, was on a skiing trip, and she worried it could be an emergency, so she picked up.
"It's my daughter's voice crying and sobbing, um, saying, 'mom,'" Jennifer said. "And I'm like, 'okay, what happened?' She's like, 'mom, these bad men have me help me, help me.'"
Jennifer said a man then demanded a ransom in exchange for Briana. He didn't want a wire transfer; instead, he told Jennifer he would come pick her up himself.
RELATED: Mom falls victim to phone kidnapping scam after caller falsely claims to have abducted son
"I started to wonder, like, if these people were, like, asking to track my mom and, like, pick her up," Briana said. "Like, they could have obviously been, like, putting some information together to try and track me or some of my siblings to actually make this a reality. Um, so it definitely scared me."
Within minutes, Jennifer confirmed her daughter was safe. She said the call was a scheme that used artificial intelligence to replicate Briana's voice.
"Most people in the modern age have some form of an online identity and have probably spoken in some way in some aspect that's been recorded, especially if you're under the age of 25," futurist and WAYE founder Sinead Bovell said. "So this becomes very, very challenging as we move into a future where we do have these AI generators or synthetic audio when it comes to verification and validation."
Experts caution that anyone with the right software can clone a voice in a matter of seconds.
RELATED: Thieves can use ChatGPT to write convincing scam messages with human-like language, experts warn
"There's a lot of positive and exciting aspects about these technologies," Bovell said. "But then of course, they also come with a lot of risks and harms."