Can AI help someone carry out a fake kidnapping scam against you or your family?

You may feel confident in your ability to avoid becoming the victim of cyber scams. You know what to look for, and you won't let anyone fool you.

Then you get a call from your son, which is unusual because he rarely calls. You hear screams and what sounds like a scuffle, so you take immediate notice. Suddenly, you hear a voice you are completely sure is your son's, screaming for help. When the alleged kidnapper comes on the line and demands money to keep your son safe, you're certain it's all real because you've heard his voice.

Unfortunately, scammers are using artificial intelligence (AI) to mimic people's voices, and those cloned voices can be put to work in schemes like fake kidnapping scams. This particular scam appears to be rare, but it does happen.


An example of a scammer. (Kurt "CyberGuy" Knutsson)

How often are fake kidnapping calls enhanced with AI?

Such fake emergency scams happen frequently enough that the Federal Trade Commission (FTC) has issued warnings and examples for consumers. However, hard numbers on how often these calls occur are not readily available, especially for calls known to use AI.

Such scams are certainly possible with current AI technology. Fake videos and audio clips of politicians and other famous people surface continuously, and with the help of AI, these clips are frighteningly believable.

You may remember the fake dental plan advertisement featuring Tom Hanks that circulated at the end of 2023. The video was created with AI technology, and Hanks was forced to make a social media post explaining that the ad was fake.


Empty warehouse with chair. (Kurt "CyberGuy" Knutsson)

MORE: ‘Unsubscribe’ email scam is targeting Americans

How do AI fake calls work?

AI technology creates the fake by analyzing a sample audio clip of the person it wants to imitate. By interpreting enormous amounts of data, it can account for the many characteristics of that person's voice, allowing it to create highly realistic fakes.

Once the AI is capable of creating the fake audio, programmers tell it what to say, crafting a personalized message designed to sell a dental plan or convince you that your loved one is being held by kidnappers.

Some AI programmers who use simulated audio for assistive purposes, such as allowing people with medical conditions like ALS to regain their "speech," claim they can mimic a voice from just a few minutes of audio. However, the more audio available, the more realistic the imitated voice should be. For example, twenty minutes of audio is much better than three.

As AI capabilities continue to expand at a rapid pace, you can expect those time requirements to shrink in the coming years.

What is Artificial Intelligence (AI)?


An illustration of artificial intelligence. (Kurt "CyberGuy" Knutsson)

MORE: How to avoid brushing scams

Do I need to worry about getting caught in a fake AI audio kidnapping scheme?

Realistically, most people don't have to worry about a fake kidnapping scheme built on AI-generated audio. However, if your loved one has posted a lot of video and audio on social media, scammers may be able to find enough source material to create a realistic fake.

Although AI makes this type of scam easier to execute, the setup process remains very time-consuming for most scammers. After all, the scammers in this type of scheme are counting on your rapidly escalating fear upon receiving the call, causing you to miss obvious clues that would tell you it's fake.

Scammers may simply have a random child scream and sob uncontrollably, counting on you to jump to the conclusion that it's your child. That is far easier than sourcing audio and generating an AI fake... at least for now.


A woman surrounded by data. (Kurt "CyberGuy" Knutsson)

MORE: How scammers use AI tools to file a tax return that looks perfect in your name

Steps you can take to protect yourself from fake kidnapping scams

Even though scammers try to gain an edge through trickery and the suddenness of fake kidnapping calls, there are steps you can take to prepare and protect yourself before and after you receive this type of call.

1. Ask your loved ones to keep you informed about trips: Fake kidnappers may try to convince you that the kidnapping took place outside your city. If you know your loved one has not left town, you can be confident the call is probably fake.

2. Set a safe word or phrase: Set a safe word that your loved ones should use if they are ever calling you due to a dangerous situation or under pressure. A scammer will not know this safe word. If you don’t hear the safe word, you know it’s probably a fake call.


3. Use privacy settings on social media: Ask your family members to limit who can see their social media posts. This makes it harder for a scammer to gather source audio usable in a fake kidnapping call. For more information on maintaining and protecting your online privacy, click here.

4. Try messaging your loved one: During or immediately after a call, send a text message to your loved one without telling the caller. Ask your loved one to reply immediately, so you can communicate without the scammers knowing. If you get a message back, you can be assured the call is fake. Consider creating a code word to use with the whole family; when you send it in a text, everyone knows it's a serious situation requiring an immediate response.

5. Stay calm and think things through: Although it is incredibly difficult to remain calm when you get this type of call, it is important to keep thinking clearly. Don't panic. Whether the call is real or a scam, panicking will never help. Listen for clues that reveal the call as a scam, and try to gather information that can help you judge the call's legitimacy.

Kurt’s highlights

As AI becomes more readily available and sophisticated, scammers will be ready to take advantage of it. Perhaps by then, AI will also level the playing field with tools that help keep us safe. Until then, taking steps to protect your family, such as setting a safe word, can give you peace of mind.


Are you concerned about how scammers may take advantage of AI to create new scams? Let us know by writing to us at cyberguy.com/contact

For more of my tech tips and security alerts, subscribe to my free CyberGuy Report newsletter at cyberguy.com/newsletter

Ask Kurt a question or let us know what stories you'd like us to cover.


Copyright 2024 CyberGuy.com. All rights reserved.
