Virtual kidnapping: how to see through this terrifying scam

esteria.white


Phone fraud is taking a scary turn as fraudsters exploit AI to inflict serious emotional and financial harm on victims.


It’s every parent’s worst nightmare. You receive a call from an unknown number, and on the other end you hear your child screaming for help. Then their “kidnapper” comes on the line and demands a ransom, or you will never see your son or daughter again. Unfortunately, this is not an imagined scenario from a Hollywood film.

Instead, it is a terrifying example of the lengths scammers will now go to in order to extort money from their victims, co-opting new technologies for nefarious purposes. It also shows the quality of AI voice-cloning technology, which is now convincing enough to fool even close family members. Fortunately, the more people know about these schemes and what to look out for, the less money phone scammers are likely to make.

How virtual kidnapping works

There are several key steps in a typical virtual kidnapping scam. Generally speaking, they are as follows:

  • Scammers search for potential victims they can call and try to extort money from. This step could also be optimized with AI tools (more on this later).
  • The scammers identify a “kidnapping” victim – most likely the child of the person they identified in step 1. They might do this by looking through their social media or other publicly available information.
  • The group then invents a scenario, making it as distressing as possible for the person they are about to call. The more afraid you are, the less likely you are to make rational decisions. Like any good social engineering attempt, the scam therefore depends on rushing the victim’s decision-making.
  • Scammers could then conduct additional open source research to work out the best time to call, combing through social media and other sources. The idea is to contact you at a time when your loved one is elsewhere, ideally away on vacation, as was the case with Jennifer DeStefano’s daughter.
  • The scammers then create audio deepfakes and make the call. Using readily available software, they generate audio mimicking the victim’s voice and use it to try to convince you that they have kidnapped a loved one. They may use other information gleaned from social media to make the scam more convincing, for example by mentioning details about the “kidnapped” person that a stranger would be unlikely to know.

If you fall for the scam, you will most likely be asked to pay in an untraceable way, such as with cryptocurrency.

Supercharging virtual kidnappings

There are variations on this theme. Most concerning is the potential for ChatGPT and other AI tools to supercharge virtual kidnappings by making it easier for fraudsters to find ideal victims. Advertisers and marketers have for years used “propensity modeling” techniques to deliver the right messages to the right people at the right time.

Generative AI (GenAI) could help fraudsters do the same, finding individuals most likely to pay if exposed to a virtual kidnapping scam. They could also search for people in a specific geographic area, with public social media profiles, and from a specific socioeconomic background.

A second option would be to use a SIM swap attack against the “kidnapped” person to hijack their phone number ahead of the scam. This would add a disturbing legitimacy to the kidnapping phone call. While DeStefano was ultimately able to confirm that her daughter was safe and well, and therefore hang up on her extortionists, doing so would be much harder if the victim’s loved one were unreachable.

The future of voice cloning

Unfortunately, voice cloning technology is already very convincing, as recent experience proves. And it is increasingly accessible to scammers. An intelligence report in May warned of legitimate text-to-speech tools being misused, and of growing underground cybercrime interest in voice cloning as a service (VCaaS). If the latter takes off, it could democratize the ability to launch such attacks across the cybercrime economy, especially when used in combination with GenAI tools.

In fact, beyond disinformation, deepfake technology is also being used for business communication compromise (as tested by our own Jake Moore) and sextortion. We are only at the beginning of a long journey.

How to stay safe

The good news is that a little knowledge can go a long way toward blunting the threat of deepfakes in general and virtual kidnappings in particular. There are things you can do today to minimize the chances of being selected as a victim, and of falling for a scam call if one does come.

Consider these high-level tips:

  • Don’t share too much personal information on social media. This is absolutely crucial. Avoid posting details such as addresses and phone numbers. If possible, don’t even share photos or video/audio recordings of your family, and certainly not details of your loved ones’ vacation plans.
  • Keep your social media profiles private to minimize the chances of bad actors finding you online.
  • Be on the lookout for phishing messages which could be designed to trick you into handing over sensitive personal information or logins to social media accounts.
  • Ask children and close family members to use geolocation trackers such as Find My iPhone.
  • If you receive a call, let the “kidnappers” speak. At the same time, try to call the suspected kidnapped person from another line or contact someone nearby.
  • Stay calm and do not share any personal information. If possible, ask the callers a question that only the kidnapped person could answer, and ask to speak to them directly.
  • Notify local police as soon as possible.

Virtual kidnapping is just the beginning. But stay informed about the latest scams and you stand a good chance of nipping attacks in the bud before they cause serious emotional distress.
