Key takeaways
- Criminals take advantage of easy-to-use, inexpensive AI tools to create convincing deepfake videos, cloned voices and messages to steal from victims.
- Reports of AI-enabled scams have surged, with many older adults targeted by impostor schemes and other forms of fraud.
- Experts urge slowing down, verifying independently, and carefully guarding personal information to protect yourself from rapidly evolving fraud tactics.
Last September, Dr. David Amron watched a Facebook video of himself with growing horror. A recognized specialist in lipedema surgery, he saw and heard himself hawking a $50 “miracle” cream for this painful, incurable condition in which fat accumulates under the skin.
But he had never made the video or endorsed the product. It was a deepfake scam, cooked up by criminals using artificial intelligence (AI). And it was so convincing that some of his own patients bought the cream.
“The video was disturbingly realistic,” says Amron, director of the Roxbury Institute in Beverly Hills, California. “My reaction was disbelief, anger and genuine concern for my patients. What unsettled me most was how authentic it appeared.”
Amron wasn’t the only one deepfaked; the video also featured Oprah Winfrey, Kelly Clarkson and the institute’s research director. Amron thinks criminals digitally altered legit online videos of his work, then used a photo of the researcher to create realistic audio and video. “This level of realism is exactly why these scams are so dangerous,” he says. “They are engineered to be believable, making it easy for vulnerable patients to trust them.”
More sophisticated AI-enabled scams
AI-enabled scams are skyrocketing, the Federal Bureau of Investigation (FBI) warns. They include deepfake videos on social media and cloned voices on the phone as well as impostor websites and phishing emails and text messages. These increasingly sophisticated AI scams often put older adults in the crosshairs, according to a December 2025 Microsoft study of fraud data from AARP and the Better Business Bureau.
At risk: your money, your personal information and your health. “We’re getting deluged,” says Bob Sullivan, host of AARP’s The Perfect Scam podcast. “A couple of years ago, you might have encountered one or two AI-generated scams a year. Now scammer call centers are sending out tens of thousands of scam messages per minute.”
Nearly 9 in 10 older adults say they’re worried about AI-enabled scams, according to the 2025 University of Michigan National Poll on Healthy Aging.
For good reason: AI has transformed the business of fraud. Half of all spam emails are now generated with AI tools, according to a 2025 Columbia University study.
“AI doesn’t sleep,” says Vijay Balasubramaniyan, CEO and cofounder of the cybersecurity company Pindrop. “It’s cheap. It works 24/7.” And it works well. Criminals are deploying free or low-cost AI tools like ChatGPT and Sora — the same ones the rest of us use for web searches and to turn photos into fun videos — as well as underworld versions with names like FraudGPT, SpamGPT and Xanthorox. “AI is accelerating how scams are created and scaled,” says Teresa Hutson, corporate vice president of Microsoft’s Trusted Technology Group.
Deepfake videos, cloned voices and chatbots that can hold realistic conversations via text, email or phone are a snap to produce. “Eight years ago, it took 20 hours of recordings to clone someone’s voice for a scam,” says Balasubramaniyan. “Now, with a photo from LinkedIn and three seconds of your voice, a scammer can create a deepfake video with audio.”
Scammers like AI so much that they’re replacing their call-center employees with AI systems, Balasubramaniyan discovered. His team had been eavesdropping on a West African scammer call center for years. In 2024, they stopped hearing the familiar voices of its 12 employees. AI-generated voices had taken over.
When researchers in Microsoft’s AI for Good Lab analyzed 531,000 fraud reports from AARP’s Fraud Watch Network Helpline and the Better Business Bureau’s Scam Tracker, they found a disturbing trend: Scams that victims identified as AI-enabled — such as those using realistic voices or videos — increased 20-fold from 2023 to 2025. The increase aligns with the arrival of widely available AI tools, says Lisa Reppell, a report coauthor and senior program manager for information literacy at AI for Good.