Deepfake Scams Targeting Parents: What to Do If You Fall Victim?

A few days ago, a woman from Arizona warned parents worldwide about an ongoing deepfake scam that uses AI to mimic the voice of your child and convince you they've been kidnapped. Since then, many parents have stepped forward with the same testimony.



Illustration: Lenka Tomašević

AI kidnapping scam targeting parents worldwide   

A bizarre AI kidnapping scam is spreading worldwide, and unlike other deepfake misuses, this one is far more harmful and believable. When you see footage of a politician seemingly talking nonsense, you can brush it off as fake. It doesn't hurt you personally, and oftentimes it's possible to fact-check it and confirm it is, in fact, a deepfake. 

But it's a completely different thing when you hear your own child over the phone, claiming to be in danger. It's impossible to imagine a parent telling their child, "No, I don't believe you've been kidnapped," and hanging up the phone. 

These scammers are obviously aware of this potential for emotional manipulation and use it to extract money from terrified parents. 

We’ve learned about this scam from a woman from Arizona, who testified at the end of last week before Congress about falling victim to a deepfake scam using AI to mimic her daughter’s voice. 

The scammers called her phone, and she heard her daughter crying and saying she had been kidnapped. They then took the phone and demanded $1 million, later reducing the ransom to $50,000. 

“How did they get her voice? How did they get her crying, the sobs that are unique to her? When your child calls in need of help, will you end the call saying ‘I don’t believe it’s really you’? Is this our new normal?” the mother asked. 

Fortunately, she was able to reach her daughter before paying the ransom, and other parents who fell victim to the same type of scam claimed the same. 

How does this deepfake scam work?  

Berkeley professor and AI expert Hany Farid explained to Inside Edition how this process is even possible: 

“I can take examples of someone’s real voice by simply searching their name on YouTube, which doesn’t take more than five seconds. Then, I uploaded it to a web service that I pay $5 a month for. In about 30 seconds, it cloned your voice, I typed what I wanted you to say and it handed me an audio file back. That whole thing took me about five minutes to do,” the professor explained. 

But how did scammers use artificial intelligence to emulate her daughter’s voice and stage a fake kidnapping? 

As the mother stated, her daughter doesn't have any social media accounts, but there are interviews online about school and sports, which she assumes the scammers used to emulate the voice. 

Meanwhile, the daughter was on a ski trip and, luckily, picked up when her mother called her, because in situations like this, every minute matters. Her mother has now made it her mission to raise awareness about this scam before more parents fall into the trap. 

According to her statement, the kidnappers demanded the sum in cash and insisted she agree to be picked up by a van. Again, this is not an isolated case; it has happened numerous times so far. 

"I lashed out at the men for such a horrible attempt to scam and extort money. To go so far as to fake my daughter's kidnapping was beyond the lowest of the low for money. I made a promise that I was going to stop them, that not only were they never going to hurt my daughter, but that they were not going to continue to harm others with their scheme," the mother told the Daily Mail. 

How to know if you’re being scammed?   

As the professor's demonstration shows, this AI impersonation is accessible to anyone, requires no advanced tech knowledge, and is ridiculously affordable. In only a couple of minutes, you can create a convincing deepfake that could scam even those closest to you. 


In a global survey of 7,000 people from nine countries, 1 in 4 people said they had experienced an AI voice cloning scam or knew someone who had. Moreover, 70% of the respondents said they were not confident they could tell the difference between a cloned voice and the real thing.

Source: Daily Mail 

What we've learned from this example is that, if you're targeted by this scam, you should immediately call the person whose voice you've heard on their own number to check whether they pick up. 

Moreover, as the mother in this case explained, when she contacted 911, they told her they were already familiar with this AI scam. Not only that, but they typically dismiss such reports as 'prank calls', so everything comes down to individual efforts to raise awareness and learn how to react in situations like these. 

Not all scams targeting parents or grandparents are related to kidnapping. In another example, a supposed grandson claimed to have crashed his car and landed in jail, asking his grandfather to send him money. US officials have warned the public about this 'grandparent scam', where an imposter poses as a grandchild in urgent need of money in a distressing situation. 

Again, we come to the same conclusion whenever we speak of artificial intelligence: the technology is remarkable, but it is also being abused by criminals, as this example proves. That's why there's a huge emphasis on the AI Act, recently passed by the European Parliament, which was created with the aim of regulating the use of this ever-growing technology. 

A journalist by day and a podcaster by night. She's not writing to impress but to be understood.