FBI Warning: AI scammers aided by deepfake technology are targeting U.S. government officials

Source: Cointelegraph | Original title: FBI Warning: Deepfake-Assisted AI Scammers Target U.S. Government Dignitaries

Hackers armed with deepfake technology are running a new phishing campaign that targets U.S. federal and state officials, impersonating senior U.S. officials in an attempt to steal sensitive data.

According to a warning from the FBI on May 15, these malicious actors have been active since April, using deepfake voice messages and texts to impersonate senior government officials and build trust with victims.

"If you receive a message purporting to be from a high-ranking U.S. official, don't assume it's genuine," the agency said. ”

The FBI also pointed out that if the accounts of U.S. officials are compromised, the scams could become even more serious, as hackers could then "exploit the trusted contact information they obtain to target other government officials or their associates and contacts."

As part of these scams, the FBI said, the hackers try to gain access to victims' accounts by sending malicious links that redirect them to hacker-controlled platforms or websites, where sensitive data such as passwords is harvested.

The agency added: "Contact information obtained through social engineering techniques may also be used to impersonate contacts in order to obtain information or funds."

In an unrelated deepfake scam, Sandeep Nailwal, co-founder of the blockchain platform Polygon, warned in an X post on May 13 that bad actors are also using deepfake technology to impersonate him.

Nailwal said the "frightening" attack vector left him a little shaken, as several people "called me on Telegram and asked if I was on a Zoom call with them and whether I had asked them to install a script."

According to Nailwal, the attackers compromised the Telegram account of Polygon's ventures lead, Shreyansh, and used it to invite contacts to a Zoom call featuring deepfake videos of Nailwal, Shreyansh, and a third person.

"The audio is disabled, and because your voice doesn't work, the scammer will ask you to install an SDK, and if you do, it's over," Nailwal said. ”

He also mentioned: "Another issue is that it is impossible to report this matter to Telegram and get their attention. I understand that they cannot handle all these service requests, but there should be a way, perhaps through some social means, to call out a specific account."

At least one user replied in the comments that scammers have also targeted them, while Web3 veteran Dovey Wan stated that she has also been deepfaked in similar scams.

Nailwal suggested that the best way to avoid falling for such scams is never to install software during online interactions initiated by someone else, and to keep a separate device dedicated to accessing cryptocurrency wallets.

At the same time, the FBI recommends verifying the identity of anyone who contacts you, checking all sender addresses for errors or inconsistencies, and looking for distorted hands, feet, or unrealistic facial features in all images and videos.

The agency also recommends never sharing sensitive information with someone you have never met, never clicking on links from people you don't know, and enabling two-factor or multi-factor authentication.
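One of the FBI's suggestions, checking sender addresses for errors or inconsistencies, can be partially automated. The minimal Python sketch below flags sender domains that closely resemble, but do not exactly match, a small allowlist of trusted domains; the allowlist entries, example addresses, and the 0.8 similarity threshold are illustrative assumptions, not anything prescribed by the FBI.

```python
# Illustrative sketch: flag lookalike sender domains that nearly match,
# but are not identical to, a small allowlist of known-good domains.
# The allowlist, threshold, and sample addresses are assumptions for this example.

from difflib import SequenceMatcher

KNOWN_GOOD_DOMAINS = {"fbi.gov", "state.gov", "polygon.technology"}  # example allowlist


def similarity(a: str, b: str) -> float:
    """Return a similarity ratio between 0 and 1 for two strings."""
    return SequenceMatcher(None, a, b).ratio()


def check_sender(address: str, threshold: float = 0.8) -> str:
    """Classify a sender address as trusted, suspicious (lookalike), or unknown."""
    domain = address.rsplit("@", 1)[-1].lower()
    if domain in KNOWN_GOOD_DOMAINS:
        return "trusted"
    # A domain that is very similar to a trusted one, but not identical,
    # matches the classic lookalike pattern (an extra, missing, or swapped character).
    for good in KNOWN_GOOD_DOMAINS:
        if similarity(domain, good) >= threshold:
            return f"suspicious: resembles {good}"
    return "unknown"


if __name__ == "__main__":
    for sender in ["agent@fbi.gov", "agent@fbl.gov", "team@polygon-technology.io"]:
        print(sender, "->", check_sender(sender))
```

A heuristic like this only complements the FBI's advice; it cannot replace verifying a contact's identity through a separate, known-good channel.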

Related: A report says Tether's delay in enforcing the USDT blacklist allowed $78 million in suspicious funds to evade freezing.
