A Houston-area couple received a call last month from their adult son, or at least they thought it was him: the voice sounded exactly like his. He said he’d been in a car accident in which he hit a woman who was six months pregnant, had just been released from the hospital, and was now in the county jail about to be charged with DWI, according to a KHOU 11 News report. He needed $5,000 to get himself out of this mess.
Absolutely convinced that the caller was their child, they handed over the cash to an intermediary who came to their home to pick it up.
How’d the scammers pull it off? Most likely by using artificial intelligence (AI) to clone the son’s voice, says Alex Hamerstone, a cyber analyst at TrustedSec, an information security consulting company. He points to the Houston case as one vivid example of how the latest generative AI technology, including voice-mimicking software, deepfake videos, and chatbots like ChatGPT, can be used by bad actors.
Those tools have the potential to level up criminals’ ability to impersonate anyone, from your grandchild to a police officer to your spouse, Hamerstone notes.
“There are a lot of scams out there that are pretty worrisome, and I always try to kind of temper that a little bit,” he says, “but this one really does scare me.”
Other experts are also concerned about AI’s potential for harm. Last month, a group of tech leaders, including Elon Musk and Apple cofounder Steve Wozniak, posted an open letter online warning that “AI systems with human-competitive intelligence can pose profound risks to society and humanity,” and calling for a six-month pause in the training of AI systems so that experts can take time to develop and implement “a set of shared safety protocols.”
A game-changer for impostor scams
The scheme described above — a version of the “grandparent scam,” where grandparents are targeted by criminals pretending to be grandchildren in crisis — is common, “but before [the use of this software] the voice could have been a giveaway,” says Steve Weisman, a professor of white-collar crime at Bentley University in Waltham, Massachusetts, and an expert in scams, identity theft and cybersecurity.
With voice-cloning tech, he adds, scammers need to capture only a few seconds of the child's audio, “which they can get from a TikTok video or an Instagram video or anything like that,” to offer a convincing impersonation.
In another widely reported incident this month, an Arizona woman named Jennifer DeStefano said she received a call from what sounded like her 15-year-old daughter, Briana, crying and claiming that she was being held by kidnappers demanding ransom money. DeStefano confirmed that her daughter, who was away on a ski trip, was safe, but she was shaken that the voice sounded exactly like Briana’s. She attributed the uncanny likeness to AI.
And anyone can use this technology:
“It’s just like downloading any other app,” Hamerstone says. “If you were recording this conversation, you could feed it into the software and type out whatever you want me to say, and it would play my voice saying that.”