By Naveen Athrappully
Increasingly hard-to-detect deepfake content created with artificial intelligence is being exploited by criminals to impersonate trusted individuals, the FBI and the American Bankers Association (ABA) said in a report published on Sept. 3.
In its “Deepfake Media Scams” infographic, the FBI said that scams targeting Americans are surging. Since 2020, the agency has received more than 4.2 million reports of fraud, amounting to $50.5 billion in losses. “Imposter scams in particular are on the rise. … Criminals are using deepfakes, or media that is generated or manipulated by AI, to gain your trust and scam you out of your hard-earned money.”
Deepfake content can include altered images, audio, or video. Scammers may pose as family, friends, or public figures, including celebrities, law enforcement, and government officials, the FBI warned.
“Deepfakes are becoming increasingly sophisticated and harder to detect,” said Sam Kunjukunju, vice president of consumer education for the ABA Foundation.
According to the infographic, certain inconsistencies in AI-generated material can give deepfakes away.
In images or videos, people should watch for blurred or distorted faces, unnatural shadows or lighting, audio that is out of sync with the video, teeth or hair that look unnatural, and blinking that is too infrequent or too frequent. With audio, people should listen closely for a tone of voice that sounds flat or unnatural.
The infographic listed three red flags of a deepfake scam: unexpected requests for money or personal information; emotional manipulation involving urgency or fear; and uncharacteristic communication from what appears to be a known individual.
To remain safe, the ABA and FBI advised Americans to think before responding to emotional or urgent requests, and to create code words or phrases to confirm the identities of loved ones.
“The FBI continues to see a troubling rise in fraud reports involving deepfake media,” said Jose Perez, assistant director of the FBI’s Criminal Investigative Division.
“Educating the public about this emerging threat is key to preventing these scams and minimizing their impact. We encourage consumers to stay informed and share what they learn with friends and family so they can spot deepfakes before they do any harm.”
According to an Aug. 6 report by cybersecurity company Group-IB, global losses from deepfake-enabled fraud are estimated to reach $40 billion by 2027.
“Stolen money is almost never recovered: Due to rapid laundering through money-mule chains and crypto mixers, fewer than 5 percent of funds lost to sophisticated vishing scams are ever recovered,” it said.
Vishing, short for voice phishing, refers to scammers impersonating authority figures such as government officials, tech support personnel, and bank employees to dupe targets and steal money.
According to Group-IB, deepfake vishing relies heavily on emotional manipulation tactics. Targets of such scams include corporate executives and financial employees.
Elderly and emotionally distressed individuals are also vulnerable to deepfake vishing because of their limited digital literacy and unfamiliarity with artificial voice technology, Group-IB added. Scams that impersonate familiar-sounding voices may therefore hit these individuals especially hard.
In June, a deepfake scam came to light in which a Canadian man in his 80s lost more than $15,000 to a scheme that used a deepfake of Ontario Premier Doug Ford.
The scam, which the victim encountered through a Facebook ad, depicted Ford promoting a mutual fund account. When the victim clicked on the ad, a chat window opened, and the scammers ultimately convinced him to invest the money.
In June, Sen. Jon Husted (R-Ohio) introduced the bipartisan Preventing Deep Fake Scams Act, which aims to tackle the threat posed by such fraud.
The bill seeks to address AI-assisted data and identity theft or fraud by setting up an AI-focused task force in the financial sector.
“Scammers are using deep fakes to impersonate victims’ family members in order to steal their money,” Husted said.
“As fraudsters continue to scheme, we need to make sure we utilize AI so that we can better protect innocent Americans and prevent these scams from happening in the first place. My bill would protect Ohio’s seniors, families and small business owners from malicious actors who take advantage of their compassion.”