The FBI’s San Francisco office warned the public earlier this week against the “escalating threat” of cyber criminals using AI technology for phishing and scams. In a statement Wednesday, the FBI said attackers have taken advantage of AI capabilities for phishing campaigns and scams targeting individuals and businesses.
“These AI-driven phishing attacks are characterized by their ability to craft convincing messages tailored to specific recipients and containing proper grammar and spelling, increasing the likelihood of successful deception and data theft,” the bureau said.
The FBI also warned the public about AI-powered voice and video cloning techniques that can impersonate individuals trusted by potential victims, leading unsuspecting targets to divulge sensitive information or approve fraudulent transactions.
“As technology continues to evolve, so do cybercriminals’ tactics. Attackers are leveraging AI to craft highly convincing voice or video messages and emails to enable fraud schemes against individuals and businesses alike,” FBI Special Agent in Charge Robert Tripp said. “These sophisticated tactics can result in devastating financial losses, reputational damage, and compromise of sensitive data.”
The FBI advised the public to be aware of urgent messages asking for money or credentials. Businesses were also called on to “explore various technical solutions to reduce the number of phishing and social engineering emails and text messages” that may affect their workers.
“Additionally, businesses should combine this technology with regular employee education about the dangers of phishing and social engineering attacks and the importance of verifying the authenticity of digital communications, especially those requesting sensitive information or financial transactions,” the bureau said.
The FBI also recommended using multi-factor authentication solutions, which can add a layer of security and make it more difficult for cybercriminals to illegally access their targeted victims’ accounts and systems.