Trusting what sounded like urgent requests for money from relatives, more than 5,000 people have fallen victim to AI voice impersonation scams, losing over $11 million.
Artificial Intelligence (AI) is currently one of the most discussed topics in technology. Following the sensational debut of ChatGPT, tech giants Microsoft and Google have joined the AI wave with an AI-integrated version of Bing and the brand-new chatbot Bard.
Despite their immense potential to answer user questions, write poetry, code, or create images from just a few lines of description, these AI tools are revealing alarming downsides. Recently, they have been exploited by scam groups to defraud unsuspecting individuals who are less tech-savvy.
Nearly Scammed by a Nephew Requesting Money
Senior couple Ruth Card (73) and Greg Grace (75) received a call from someone claiming to be their nephew Brandon, who said he was in jail with no wallet or phone and needed money to post bail.
Although the call came from an unfamiliar number, the voice on the other end sounded just like their nephew, and Mrs. Card rushed to get him the money. “I was so scared that I could only think about helping my nephew immediately,” she recalled.
The couple withdrew 3,000 CAD (about 2,207 USD), the withdrawal limit on their account, then went to a second branch to take out more.
At that branch, however, a manager called them into his office and explained that the bank had seen a similar call before and that it had turned out to be a scammer impersonating a relative's voice. Only then did Mrs. Card and Mr. Grace realize that the person on the line was not their nephew.
Everyone should be wary of money requests from relatives and friends. (Photo: WRIC).
“That was when we snapped out of it. We had been convinced that the person was our nephew Brandon,” Mrs. Card said.
According to the Washington Post, Mrs. Card's experience reflects how impersonation scams are growing both more common and more sophisticated. In 2022, impersonation scams were the most frequently reported type of fraud in the U.S., with over 36,000 cases.
Most of these involved criminals posing as the victims' friends or family. More than 5,100 of the incidents took place over the phone, causing losses of $11 million, according to the U.S. Federal Trade Commission (FTC).
Advances in technology have made it easier for bad actors to mimic voices and pose as distressed relatives, and the victims are often older adults. With the help of AI, numerous online tools can now imitate a person's voice from an ordinary audio sample and speak any text the user types in.
Experts say that authorities and regulators are lagging behind this sophisticated form of fraud. Most victims do not realize they are being scammed until it is too late, and police find it difficult to trace the calls or money transfers because the criminal groups are scattered around the world.
How AI is Aiding Criminals
The method these criminals use most often is to impersonate a relative, typically a child, partner, or friend of the victim, and plead for money by claiming to be in dire straits.
AI-generated voices make the scheme far more convincing and more dangerous. This is the dark side of generative AI, which can create text, images, or audio based on the data it is trained on.
Voice synthesis software works by analyzing what makes a person's voice distinctive, such as age, gender, and accent, then searching a vast database of voices for similar ones and predicting speech patterns. From there it can reproduce the pitch and timbre of the original speaker and generate a voice that sounds nearly identical. The source material is often audio clips taken from YouTube, TikTok, Instagram, Facebook, podcasts, or advertising videos.
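To give a sense of how low the barrier has become, the sketch below shows voice cloning with the open-source Coqui TTS library, one of many freely available tools of the kind the article describes. It is an illustration only: the model name, file paths, and sample sentence are assumptions, not tools or data named in the reporting, and a real scammer's toolchain may look quite different.

```python
# Minimal voice-cloning sketch using the open-source Coqui TTS library
# (pip install TTS). Model name, file paths, and text are example values.
from TTS.api import TTS

# Load a multilingual voice-cloning model (XTTS v2).
tts = TTS("tts_models/multilingual/multi-dataset/xtts_v2")

# A short recording of the target speaker -- e.g. audio pulled from a
# public video -- serves as the reference sample for cloning.
tts.tts_to_file(
    text="This is a demonstration of voice cloning from a short audio sample.",
    speaker_wav="reference_sample.wav",  # clip of the voice to imitate
    language="en",
    file_path="cloned_voice.wav",        # synthesized output
)
```

The point of the example is that no specialized data is required: a few seconds of publicly posted audio and a consumer GPU, or even a hosted service, are enough to produce speech in someone else's voice.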
“Two years ago, you needed a lot of audio to mimic someone's voice. Now, if you have ever appeared on Facebook or TikTok, anyone can clone your voice using AI,” said Professor Hany Farid of the University of California, Berkeley.
Anyone can be impersonated by AI. (Photo: Security Intelligence).
ElevenLabs is one of the popular AI voice-cloning services, with plans ranging from free to $5 or $330 per month. The software has drawn criticism after being misused to put words in celebrities' mouths, splicing cloned audio into videos so that they appear to genuinely say offensive things.
In a Twitter post, ElevenLabs said it is adding safeguards to limit such abuse, including barring free users from creating custom voices and launching a tool to detect AI-generated audio.
Lost $15,000 Due to Naivety
However, these measures are still not enough. Benjamin Perkin knows this firsthand: his parents lost thousands of dollars to a voice scam. Their nightmare began when they received a call from someone claiming to be a lawyer.
The caller said their son had caused a car accident in which someone was killed, and that he was now in jail and needed money for legal proceedings.
The “lawyer” then put what sounded like Benjamin Perkin on the phone, and the fake voice pleaded that he urgently needed the money. A few hours later, the impostor called Perkin's parents again, demanding 21,000 CAD (about 15,449 USD) before a supposed court date.
Although something felt off, the couple were so worried that their son was in danger that they went along with it. “The voice was so realistic that my parents believed they were actually talking to me,” Perkin said. They went to several different banks to withdraw cash and sent the money to the “lawyer” via Bitcoin.
It wasn't until the real Benjamin Perkin called that the couple realized they had been scammed. The family reported it to the police right away but could not recover the money. “All the money just vanished. No insurance. No way to get it back. We've lost everything,” Perkin said.
According to Will Maxson, an assistant director in the FTC's Division of Marketing Practices, tracing these voice scammers is very difficult because they can spoof their phone numbers and operate from anywhere in the world. He therefore urges everyone to be more cautious and vigilant.
If a relative asks for money, do not send it immediately; call other family members first to verify. Even if the call appears to come from a relative's number, it could still be a scam. People should also avoid sending money via gift cards or other hard-to-trace methods and treat any request for cash with suspicion, Maxson advised.