FBI issues a warning to all smartphone users about AI scam calls that drain bank accounts and steal millions of dollars


The Federal Bureau of Investigation (FBI) is currently investigating thousands of reports of grandparent scams.

Grandparent scams typically involve a malicious actor contacting the victim and impersonating a relative or close friend.

The scammer then demands bail money while pretending to have run afoul of the law.

As artificial intelligence (AI) advances, however, these scams have become considerably more dangerous.

Criminals are now tricking older adults into believing they are speaking with a relative by using a social engineering technique known as AI voice cloning.

In a voice cloning attack, the attacker chooses a victim, searches social media for short video clips containing their voice, and then replicates it.

Attackers have also begun posing as phony attorneys to further pressure victims during phone calls.

Spoofing is a practice in which bad actors alter the caller ID shown on the victim’s phone so that a call appears to come from a trusted source.
