The days when verified Twitter users changed their names to impersonate celebrities, companies, and lawmakers (at the risk of a permanent ban) now look like a quaint stunt compared to what is possible with the tools of unregulated artificial intelligence in the hands of the bored and the bad-faith.
The State Department is learning this the hard way. According to a Tuesday report from The Washington Post, an individual posing as Secretary of State Marco Rubio used AI software to mimic the secretary’s voice and writing style. The individual, who has not yet been identified, contacted several foreign ministers, a member of Congress, and a U.S. governor.
Most of the attempted outreach took place via Signal, the encrypted messaging app that attained political infamy after former White House National Security Adviser Michael Waltz inadvertently added a journalist to a group chat discussing strike plans against Yemen.
According to a July 3 State Department cable obtained by the Post, the imposter sought to contact the officials “with the goal of gaining access to information or accounts.” The person used the fake email address “Marco.Rubio@state.gov” as a display name and “left voicemails on Signal for at least two targeted individuals and in one instance, sent a text message inviting the individual to communicate on Signal.”
It’s not the first time AI has been used in attempts to impersonate public officials. In May, the FBI warned of “an ongoing malicious text and voice messaging campaign” that used the emerging technology to impersonate senior U.S. officials in an attempt “to establish rapport before gaining access to personal accounts.”
During the 2024 election cycle, a political consultant was charged with voter suppression and impersonating a candidate after using AI to create a facsimile of then-President Joe Biden’s voice. The robocall discouraged New Hampshire voters from participating in the state’s Democratic primary.
In September of last year, former Senator Ben Cardin joined what he thought was a Zoom call with Dmytro Kuleba, Ukraine’s former foreign minister, whom he had met in the past. Once on the call, Cardin quickly realized the person on the other end was not Kuleba at all but a deepfake, looking to ask “politically charged questions in relation to the upcoming election.”
There has yet to be a major diplomatic or political scandal in the United States associated with undetected AI impersonations, but at this point it feels like it’s only a matter of time.