How to avoid AI voice scams

09:41 AM March 23, 2024

Think of a robot’s voice. What does it sound like? Does it sound stilted and stuttering? Perhaps it sounds like Arnold Schwarzenegger from the popular Terminator robot movies?

Believe it or not, AI can replicate your voice almost perfectly. It can mimic your inflections and intonations, making the clone nearly indistinguishable from the real thing.

Imagine someone using your voice to ask for money for an emergency. This is a real scenario playing out worldwide as AI voice scams become more prevalent.

Fortunately, you have ways to defend yourself against AI-powered schemes. Let’s discuss the most recent methods.

The 4 ways to protect yourself from AI voice scams

  1. Fact-check and verify.
  2. Listen for unusual cues.
  3. Avoid sharing your voice online.
  4. Have a family password.

1. Fact-check and verify

This represents a victim of an AI voice scam.
Free stock photo from Pexels

NBC News reported that AI-powered voice calls discouraged New Hampshire citizens from voting in the presidential primary last January.

That is why CNBC reminded the public to fact-check and verify phone calls to ensure they aren’t scams. For example, be skeptical if a high-ranking official calls you directly.

READ: US regulates AI-generated voice calls

After all, would you believe President Bongbong Marcos would call every Filipino personally through the phone?

The same goes if you get a call from a relative requesting a huge sum of money. Ask yourself if your loved one is truly in danger and if the emergency is real.

If you suspect something’s wrong, hang up and reach your relative through their known contact details. They can confirm whether they are truly in danger.

2. Listen for unusual cues

Recall that the introduction said artificial intelligence can mimic your voice “almost perfectly.” That qualifier matters: AI-generated calls may still leave subtle signs that the voice is synthetic.

For example, you might notice robotic speech patterns, unnatural pauses, or alterations in speaking tone. 

You might suspect something’s off if the distressed person suddenly becomes monotone. Also, you may hear pronunciation errors, especially for non-English words.

These cues can be difficult to notice when you’re panicked in response to someone in need. Nonetheless, watching out for these signs can help you avoid AI voice scams.

3. Avoid sharing your voice online

Reem Alattas, the Director of Value Advisory for Spend Management at the German tech firm SAP, posted on LinkedIn about avoiding AI voice scams. 

She warned people to avoid posting videos or audio recordings of their voices online as much as possible.

READ: How to avoid Gmail verification scams

Strangers could take those samples and feed them to an AI voice generator to replicate your voice.

Of course, that can be difficult since many of us post videos of ourselves on social media for fun and work. That is why you should minimize the risk by restricting who can view your profiles.

4. Have a family password

The Electronic Frontier Foundation is a nonprofit organization “defending civil liberties in the digital world,” and it shared advice on avoiding AI voice-cloning scams, too.

It suggests having a family password. Discuss with your family a specific phrase or word that you can all remember. 

Agree that everyone should ask for that password whenever a family member is in an emergency. 

You can get creative by turning it into a spy-thriller-like question-and-answer portion. 

For example, ask, “What’s for dinner?” and you must answer, “Sinigang na ube with coconut bits.” 

The key is to make sure the family password is unique and easy to remember. 
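For the technically inclined, the family password is just a spoken challenge-response check. The Python sketch below is purely illustrative (the challenge and answer are the hypothetical examples from above, and the dictionary name is made up); in practice, the check happens in your head, not in code:

```python
# Illustrative sketch of the family-password idea as a challenge-response check.
# The phrases below are hypothetical examples, not recommendations.
FAMILY_CHALLENGES = {
    "What's for dinner?": "Sinigang na ube with coconut bits",
}

def verify_caller(challenge: str, response: str) -> bool:
    """Return True only if the caller's answer matches the agreed phrase."""
    expected = FAMILY_CHALLENGES.get(challenge)
    # Ignore surrounding spaces and letter case, but require the exact phrase.
    return expected is not None and response.strip().lower() == expected.lower()

print(verify_caller("What's for dinner?", "Sinigang na ube with coconut bits"))  # True
print(verify_caller("What's for dinner?", "Adobo"))  # False
```

The point the sketch makes is that the answer must match exactly: an AI scammer who has cloned a voice still cannot guess an arbitrary, pre-agreed phrase.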

However, this method might not be advisable for the elderly because they may forget the secret code, especially in an emergency.

TOPICS: online scams

© Copyright 1997-2024 INQUIRER.net | All Rights Reserved
