This is frightening and highlights the danger of artificial intelligence (AI).

An Arizona woman, Jennifer DeStefano, answered a call from a phone number she didn’t recognize. She heard an unfamiliar male voice saying he had kidnapped her daughter and wanted $1 million in ransom.

She then recognized her 15-year-old daughter’s voice saying, through tears, “Mom, I messed up.”

But it wasn’t her daughter. It was AI technology that sounded just like her daughter’s voice.

DeStefano was at a dance studio with her other daughter when she received the phone call.

She told KPHO TV News she decided to answer the call because her daughter, Brie, was on a skiing trip and she feared there might have been an accident.

“I pick up the phone and I hear my daughter’s voice, and it says, ‘Mom!’ and she’s sobbing,” DeStefano told KPHO. “I said, ‘What happened?’ And she said, ‘Mom, I messed up,’ and she’s sobbing and crying.”

DeStefano said she then heard a man’s voice say, “Put your head back, lie down.”

Her confusion turned to terror, she told the news outlet.

“This man gets on the phone, and he’s like, ‘Listen here. I’ve got your daughter. This is how it’s going to go down. You call the police, you call anybody, I’m going to pop her so full of drugs. I’m going to have my way with her and I’m going to drop her off in Mexico,’” said DeStefano. “And at that moment, I just started shaking. In the background, she’s going, ‘Help me, Mom. Please help me. Help me,’ and bawling.”

The man then demanded money, first $1 million, then lowered the demand to $50,000 when DeStefano told him she didn’t have that kind of money.

At the dance studio, one worried mom called 911 and another called DeStefano’s husband.

KPHO reports, “Within just four minutes, they confirmed her daughter was safe.”

“She was upstairs in her room going, ‘What? What’s going on?’” said DeStefano. “Then I get angry, obviously, with these guys. This is not something you play around with.”

At that point, she hung up, but she said there was no doubt in her mind that it was her daughter’s voice.

“It was completely her voice. It was her inflection. It was the way she would have cried,” she said. “I never doubted for one second it was her. That’s the freaky part that really got me to my core.”

Blaze News reported, “Though AI-generated voices are nothing new, Subbarao Kambhampati, a computer science professor at Arizona State University, claims that advancements in AI technology, much of which is freely available online, allow scammers to recreate someone’s voice using just a short sample.”

“Now, there are ways in which you can do this with just three seconds of your voice,” Kambhampati said. “Three seconds. And with the three seconds, it can come close to how exactly you sound.”

“You can no longer trust your ears,” he said.

AI technology has very little oversight and, Kambhampati says, is getting much easier to access and use.

“It’s a new toy, and I think there could be good uses, but certainly, there can be pretty worrisome uses too,” he said.

“With videos abounding on social media and television, many people have unwittingly put themselves at risk of voice cloning,” said Blaze News, adding that law enforcement agencies have come up with ways to help people avoid becoming victims of a voice scam:

- Ask the alleged abductee questions that only he or she could answer.
- Be wary of unfamiliar numbers, especially international calls or those from an unknown area code.
- Establish a “safe word” with loved ones that they will share only if they are ever in danger.

“It all just seemed so real,” said DeStefano. “The only way to stop this is with public awareness.”

Watch the interview.