FBI’s Warning: How to Protect Yourself from AI-powered Schemes

Editor’s note: In recognition of National Cybersecurity Awareness Month this October, we are publishing a series of blog posts dedicated to educating and informing you about cybersecurity practices. This is the first post in that series; links to the rest are below:

  • Stuck on Windows 10? Here Are Your Options After Support Ends (Posting Thursday)
  • Job Scam Texts are On the Rise: Here are 5 Red Flags to Watch Out For (Posting Monday)
  • Balance in Cybersecurity: Lock the Doors Before Boarding the Windows (Posting Tuesday)
  • What is a Vishing Scam and How Do I Protect Myself? (Posting Oct. 8)
  • The CIA of Data Security: What It Means and Why It Matters (Posting Oct. 13)
  • BYOD for Smartphones: Balancing Security, Privacy and Cost (Posting Oct. 15)

Imagine answering a phone call from a loved one in crisis — only it’s not your loved one but a voice clone that sounds eerily similar.

Cybercriminals created that voice clone using generative AI, and the FBI is warning the public as these attackers find new and inventive ways to trick people into sending money or handing over personal information.

On top of voice cloning, AI can generate text that mimics a person’s writing style and tone, as well as photos and videos of anyone doing almost anything imaginable.

“Generative AI reduces the time and effort criminals must expend to deceive their targets,” the FBI said in a public service announcement.

The PSA shared some of the more common scams, as well as suggestions for identifying and defending against those scams.

Common AI Scams

Voice cloning is just one of the many ways cybercriminals utilize AI to commit fraud.

Here are some of the more common scams the FBI has discovered:

  • Using AI-generated text, images and videos to create realistic-looking social media profiles for social engineering, spear phishing, romance scams and investment schemes.
  • Using AI-generated images and videos of natural disasters to elicit donations for fake charities.
  • Using AI-generated images and videos of celebrities or social media influencers to promote counterfeit products or nondelivery schemes.
  • Using AI-generated images and videos to impersonate law enforcement, company executives or other authority figures to solicit payments or information.
  • Using AI-generated images to create pornographic photos of a victim to demand money in sextortion schemes.

How to Protect Yourself

The scary part is that AI can be very convincing if you aren’t careful and vigilant. These scams rely on you letting your guard down and not thinking critically about what is being asked of you.

While cybercriminals will continue to refine their scams as AI evolves, the FBI offered some basic suggestions that should help you in most situations:

  • Create a secret word or phrase with your family and friends to verify their identity.
  • Look for subtle imperfections in images and videos, such as distorted hands or feet, unrealistic teeth or eyes, indistinct or irregular faces, inaccurate shadows, watermarks and unrealistic movements.
  • During a phone call, listen closely to the tone and word choice to spot an AI-generated voice clone.
  • If possible, limit online content featuring your image or voice, make social media accounts private and limit followers to people you know.
  • Verify a caller’s identity by hanging up, looking up the organization’s contact information yourself and calling that phone number directly.
  • Never share sensitive information with people you have met only online or over the phone.
  • Do not send money, gift cards, cryptocurrency or other assets to strangers.

What Should I Do If I’ve Been Scammed?

If you believe you have been a victim of financial fraud, the FBI said you should file a report with the agency’s Internet Crime Complaint Center (IC3).

You’ll want to be sure to include:

  • Name, phone number, address and email address of the person who contacted you.
  • Financial transaction information, such as payment amount, date, type of payment, how the payment was issued, and names and addresses of financial organizations.
  • A description of your interaction with the scammer, including how contact was made, which communication method was used, the reason for the money request, what information you provided and any other details.

Train Your Employees to Spot Scams

Scammers often target multiple employees in an organization, hoping just one of them falls for the scam. The best way to combat these scams is to implement phishing training and take it seriously.

Contact us now to schedule a meeting to discuss what phishing training entails and how it can safeguard your organization with just one training session per quarter.

Stay updated! Get tips and insights delivered to your inbox weekly by subscribing to our newsletter.
