FBI Reports AI Is the Latest Tool in Sextortion Cases

Sextortion isn’t a new phenomenon. For decades, criminals have threatened people (mostly women) with the exposure of sexually compromising or explicit photos of them online. Sextortion can involve demands for money or for more photos of the victim. Sometimes the images these criminals possess are real; other times they are fake, and the mere threat of exposure is enough to pressure victims into paying a ransom.

AI Deepfakes Proliferate in Sextortion Cases

On Monday, the FBI issued an alert warning that explicit, AI-generated videos of victims are frequently being used to extort money or other payments.

As of April 2023, the FBI has observed an uptick in sextortion victims reporting the use of fake images or videos created from content posted on their social media sites or web postings, provided to the malicious actor upon request, or captured during video chats. Based on recent victim reporting, the malicious actors typically demanded:

1. Payment (e.g., money, gift cards) with threats to share the images or videos with family members or social media friends if funds were not received; or

2. The victim send real sexually themed images or videos.

ic3.gov

Criminals create these deepfake videos by swiping images from a victim’s social media accounts and using online tools to generate explicit videos. They then send the content directly to the victim, or post the videos where the victim can discover them.

If you suspect you are the victim of a crime, the FBI encourages you to contact the appropriate authorities.

It’s hard being a woman on the Internet. Be safe out there.

-MJ