“Activity dating back to 2023 reveals malicious actors have impersonated senior U.S. state government, White House, and Cabinet-level officials, as well as members of Congress to target individuals, including officials’ family members and personal acquaintances,” the alert said.
“If you receive a message claiming to be from a current or former senior U.S. official, do not assume it is authentic.”
The threat actors have sent text messages and artificial intelligence-generated voice messages as part of their impersonation campaigns.
In the scheme, criminals impersonating a U.S. official contact a target and briefly engage in conversation on a topic the target is well-versed in, the agency said. They then request that the communication move to an encrypted mobile messaging application. In most cases, the initial contact is made via SMS before the conversation shifts to an app such as Telegram, WhatsApp, or Signal.
Once this communication channel is set up, they inquire about events such as bilateral relations, current affairs, and trade and security policy negotiations, according to the FBI.
The malicious actors then suggest setting up a meeting between the target and the U.S. president or other high-ranking officials. They may also falsely claim that the target is being considered for nomination to a company’s board of directors.
According to the FBI, targets have been asked to take certain actions, including providing an authentication code that allows the fraudsters to sync their own device with the contact list on the target’s phone, supplying personally identifiable information and copies of personal documents, and wiring funds under false pretenses to a financial institution located abroad.
The malicious actors may also use voice phishing tactics, deploying audio generated by artificial intelligence to impersonate public figures or the target’s personal acquaintances and boost the scheme’s believability.
In the latest alert, the FBI issued guidance to help identify suspicious messages. It advised people to verify the identity of the person contacting them and to listen closely to the tone and word choice during a call or voice message to assess whether it is AI-generated.
It asked people to refrain from sharing sensitive information with anyone they have met only online or by phone, and not to send money or assets to such individuals.
Meanwhile, losses reported in such complaints jumped from roughly $240 million to more than $405 million during this period.
AI Impersonation, Social Security Warning
In September, the FBI and the American Bankers Association warned that AI-backed deepfake impersonation scams were getting harder to detect.
“Imposter scams in particular are on the rise. … Criminals are using deepfakes, or media that is generated or manipulated by AI, to gain your trust and scam you out of your hard-earned money,” the agency said.
Deepfake content can include altered audio, video, or images. The scammers can impersonate public figures, such as celebrities or the target’s family and friends, the FBI said.
The scam involves criminals threatening beneficiaries, claiming that their Social Security numbers will be suspended within 24 hours and that their cases will be referred for criminal prosecution, and urging them to call a provided phone number to reach the Social Security Administration’s Office of the Inspector General (OIG).
When victims call the number, the scammer impersonates an SSA employee and requests the target’s personal information.
“Scammers continue to exploit fear and confusion by using official-looking letters and real SSA employee names to threaten you and convince you they’re legitimate so that you will respond and provide them with your personal information and money,” acting Inspector General Michelle L. Anderson said.