
FBI warns of kidnapping scams as hackers turn to AI to provide 'proof of life'


(Image credit: Shutterstock / LookerStudio)
  • FBI warns criminals are using GenAI deepfakes in kidnapping and extortion scams
  • Attackers generate fake “proof of life” videos from social media images, demanding ransom payments
  • Citizens advised to limit online exposure, set family code words, and verify loved ones before paying

Hackers are using generative artificial intelligence (GenAI) to create convincing deepfake videos, which are then used as "proof of life" in kidnapping and extortion scams.

That is according to the US Federal Bureau of Investigation (FBI), which recently released a new Public Service Announcement (PSA) warning citizens not to fall for the trick.

Here is how the scam works: the criminals pick a target and scour social media and other sources for images and videos. If they find enough material, they feed it into an AI tool to create videos and images depicting their target's loved ones as kidnapped. Then, they will ...

