As Artificial Intelligence (AI) continues to evolve, the risk of scams targeting older adults has increased. Scammers often focus on an older demographic because of a lifetime of accumulated wealth, potential unfamiliarity with technology and other factors. Deepfake technology is contributing to this rise in scams. Deepfakes are AI-generated videos, photos and audio that fabricate or alter someone’s likeness to a realistic degree.

Below are common deepfake scams, signs to detect them and steps to take if you’ve been impacted.

Common deepfake scams

Tech support scams – Scammers deploy pop-up warnings, emails or text messages claiming there is a security risk to your device, then pose as tech repair workers to gain access to your technology.

Investment scams – Scammers may impersonate financial experts by promoting fake investment opportunities. Once you “invest” your money, the scammers will disappear without a trace.

Romance scams – With AI, scammers can create convincing false personas using face-swapping tools and AI-altered photos. They will nurture the relationship until they gain your trust, then ask for money.

Government or family imposter scams – Scammers can use deepfake technology to clone a voice and impersonate a political figure, a law enforcement official or a family member. They may call as a political figure asking for donations, a law enforcement official claiming a payment is needed to avoid jail time or a family member needing money urgently.

Lottery or sweepstakes scams – Scammers will contact you claiming you’ve won a lottery or sweepstakes you didn’t enter and insist you pay a fake tax or fee in order to claim your winnings.

Signs to help detect a deepfake:

  • Strange body or facial movements
  • Unnatural reflections or shadows
  • Audio that is out of sync with mouth movements
  • Jagged or pixelated images
  • Clothing color that changes throughout the clip
  • Glossy or “too perfect” look
  • Robotic tone of voice with no background noise

If you have been impacted by a deepfake scam, here are a few immediate steps to take:

  1. Immediately stop all communication with the scammer, including phone calls, emails and messages.
  2. Contact the Federal Trade Commission (FTC) to file a complaint online* or by phone at 877-FTC-HELP (877-382-4357).
  3. File a complaint with the FBI through the Internet Crime Complaint Center* (IC3).
  4. Notify the financial institutions you work with. This includes your bank, investment companies, insurance companies, and brokerage firms.
  5. File a police report with your local police department.
  6. Change all your internet banking passwords.

*Please Note: There are external links included in this article that will take you to a website Bankers Trust does not control. Bankers Trust has provided these links for your convenience but does not endorse and is not responsible for the content, links, privacy policy, or security policy of external websites.

Amy Berger is AVP, Fraud and Security Supervisor at Bankers Trust. She joined the bank in 2012 and held various roles in our branches before joining the Financial Intelligence team. Amy’s work focuses on fraud prevention, physical security and business continuity. She holds the Certified Community Bank Security Officer (CCBSO), Certified AML and Fraud Professional (CAFP), Associate Business Continuity Professional (ABCP) and Certified Banking Business Continuity Professional (CBBCP) designations.
