Artificial intelligence has brought incredible tools to our daily lives—but it’s also opened the door to new and dangerous scams, especially for older adults. One of the fastest-growing threats in 2025? Deepfakes—AI-generated videos, audio clips, and images that mimic real people with alarming accuracy.
From fake phone calls that sound like your grandchild to convincing video messages from “trusted” figures, deepfake scams targeting seniors are on the rise. In this guide, you’ll learn how to spot deepfakes, protect yourself and your loved ones, and stay confident in today’s digital world.
What Are Deepfakes, and Why Are They Dangerous?
Deepfakes are AI-generated videos or audio recordings designed to look and sound like real people—even if those people never said or did what’s shown.
They’re often used to:
- Impersonate loved ones (e.g., “Grandma, I’m in trouble, please send money!”)
- Fake public officials or celebrities asking for donations
- Mimic banks or Medicare reps asking for sensitive info
- Trick people into sending money or giving up passwords

Why Seniors Are Often Targeted
Scammers know that older adults are:
- Less familiar with AI-generated content and how realistic it can be
- More likely to respond to family-related emergencies
- Often generous and trusting
- Using phones and email more frequently, but not always with strong digital security habits
Deepfake scams are designed to exploit emotion, urgency, and confusion.
How to Spot a Deepfake (Even a Really Good One)
1. Unexpected Requests from Family
If your child or grandchild “calls” and asks for money urgently:
- Hang up and call them back on a trusted number
- Deepfakes can replicate voices using just a few seconds of audio
- Beware of calls that say, “Don’t tell anyone” or “I lost my phone—just send it here”
2. Strange Eye, Lip, or Voice Cues in Videos
In many deepfakes, the person’s:
- Eyes don’t blink normally
- Mouth movements don’t quite match the audio
- Voice sounds slightly robotic or off in tone
3. Pressure to Act Quickly
Deepfake scams often include urgent messages like:
- “You must respond now!”
- “Wire money immediately”
- “Click this link to secure your account”
Real businesses and real loved ones will never mind you taking time to verify.

Common Deepfake Scam Types in 2025
1. Voice Cloning Scams
You get a call from your "grandchild" saying they've been arrested or been in an accident and need money. But it's a fake voice cloned from social media posts or voicemails.
2. Fake CEO or Government Calls
A “Medicare agent,” “IRS rep,” or “company CEO” calls with urgent requests—complete with familiar voices or even video. Don’t trust voice alone.
3. Romance Scams with Video
Deepfakes are now being used in online dating scams—where a scammer sends fake videos of a “real” person to build trust before asking for money.
4. Phony Charity Appeals
A “celebrity” sends a video asking you to donate to a cause. If it seems too perfectly personalized, it could be fake.
Smart Ways to Protect Yourself Online
1. Verify Everything
- Always call a known number or contact directly before sending money or clicking a link
- Ask personal questions that only your real family would know
2. Use Two-Factor Authentication
Secure your email, bank, and social media accounts with 2-step logins to prevent hackers from stealing your info.
3. Update Your Privacy Settings
Limit what strangers can see (and hear) on your Facebook, Instagram, or other social accounts. The less public audio/video of you and your family, the safer you are.
4. Install Call and Email Filters
Use spam filters, antivirus software, and call blockers. Services like Nomorobo, Hiya, or your phone carrier can flag suspicious calls.

FAQs
How can scammers get my family’s voice or video?
They may scrape audio from social media, YouTube, voicemail, or past recordings. A few seconds is often enough for AI to clone a voice.
What should I do if I think I’ve been scammed?
- Stop communication immediately
- Contact your bank or credit card company
- Report it to the FTC at reportfraud.ftc.gov
- Tell your family—you’re not alone
Is it safe to answer unknown calls or messages?
In 2025, it’s best to let unknown numbers go to voicemail, then listen carefully before responding. Use caution with text links or videos from unfamiliar senders.
Final Thought: Don’t Let Fear Win—Let Knowledge Empower You
Deepfakes are clever, convincing, and getting better—but with awareness, skepticism, and simple digital habits, you can protect yourself and your loved ones.
When in doubt, pause, verify, and talk to someone you trust. The scammer’s greatest weapon is urgency. Your greatest defense? Calm, confident action.