A young woman is warning others about the dangers of AI after her family was targeted by scammers.
“It’s really traumatizing, it really did traumatize us,” said Payton Bock.
Bock recently shared the warning on TikTok and spoke with Fox 8 News because the perpetrator was never caught, and law enforcement says many more of these AI scammers could be out there.
“It could get really crazy really soon,” said Bock.
Here’s how this scam happened.
While Bock was at work, her parents received a phone call from a man who claimed he had been in a car accident with their daughter.
The man said Payton couldn’t find her insurance card and that he was going to harm her if they didn’t give him money.
Both of Payton’s parents are street savvy and, she says, would never have fallen for such a scam if not for what happened next.
“He said he was gonna kill me and that he had me in the back of his truck,” said Payton.
Her parents demanded to hear her voice… and did!
“It was literally my voice telling them, ‘Mom, I don’t want to die,’ like I’m scared and I’m crying. It was a crazy situation,” said Payton.
The perpetrator had cloned Payton’s voice with AI.
According to Vipin Chaudhary, Chair of Computer and Data Sciences at Case Western Reserve University, three seconds of audio is all it takes for someone to clone a person’s voice using artificial intelligence.
The cloned voice can then be made to say anything, in any language and with any emotion.
“Yes, it’s very easy to fake a person from an audio perspective,” said Chaudhary. “I can have me talk in Chinese and it will be as if I’m talking, so the technology is pretty amazing. And we are moving towards video, so you will have videos of any of us which will be fake.”
Chaudhary says “AI is exploding” so fast that even experts find it challenging to keep up with advances, which can be both good and bad.
“The pace is amazing,” said Chaudhary. “There is both good and bad. I think it’s how you use it; it’s a technology, so it’s how we set up regulations that is going to dictate so that the bad is limited.”
In response to the AI scam, the Cleveland FBI offered these tips to anyone receiving a suspicious call.
First, try to contact the person the scammer mentions to confirm their safety and whereabouts.
Next, report the call to local law enforcement.
And if the caller demands money, gift cards, cryptocurrency or other assets, call the FBI.
Payton’s family had tried to reach her, but she was busy at work.
It wasn’t until the police tracked her down that they realized it was a scam.
“I called my mom, and it was like the most frantic answer… I could just feel her pain,” said Bock.
That day, Payton began sharing her real-time location with her parents through their smartphones, and she recommends others do the same.
She also suggests having a family password and hopes the technology will be better regulated soon.
“They don’t know the code word, so they can’t AI-generate the code word and have it be very specific,” said Bock. “I mean, if this AI was only being used for good there wouldn’t be a need for rules, but all of this negative stuff is coming from it. It’s just too scary, and the risk is too high.”