FBI: Scammers Using Public Photos, Videos for Deepfake Extortion Schemes

The FBI is warning that scammers are using AI technology to create sexually explicit deepfake photos and videos of people in a bid to extort money from them, a crime known as “sextortion.”

The threat is particularly disturbing because it exploits the benign photos people post on their social media accounts, which are often public. Thanks to advancements in image- and video-editing software, a bad actor can take those photos and use them to create AI-generated porn featuring the victim’s face.

“The FBI continues to receive reports from victims, including minor children and non-consenting adults, whose photos or videos were altered into explicit content,” the agency said in the alert. “The photos or videos are then publicly circulated on social media or pornographic websites, for the purpose of harassing victims or sextortion schemes.”

As a result, the FBI is warning the public about the danger of posting photos and videos of themselves online. “Although seemingly innocuous when posted or shared, the images and videos can provide malicious actors an abundant supply of content to exploit for criminal activity,” the alert says.

The FBI did not say how many complaints it has received. But the agency issued the alert as it has seen thousands of sextortion schemes targeting minors. These schemes often involve an online predator posing as an attractive girl to dupe a teenage boy into sending nude photos; the scammer then threatens to post the images online unless the victim pays.

In today’s alert, the FBI noted recent sextortion schemes have also involved the use of deepfakes. “As of April 2023, the FBI has observed an uptick in sextortion victims reporting the use of fake images or videos created from content posted on their social media sites or web postings, provided to the malicious actor upon request, or captured during video chats,” the agency said. In some cases, the predators will also use the deepfakes to pressure a victim into sending them “real sexually-themed images or videos.”

In the meantime, the rise of malicious deepfakes could prompt more states to outlaw their use. Only a few states, such as Virginia and California, have banned deepfake porn. But last month, Rep. Joe Morelle (D-NY) introduced federal legislation that would ban non-consensual deepfakes, making them a criminal offense.
