Meta is aiming to help young people stop their intimate images from spreading online without their consent, via a new platform dubbed Take It Down.
The site is modeled after a similar service for adults called StopNCII. It’s built in partnership with the National Center for Missing and Exploited Children (NCMEC) and funded by Meta.
“Losing control of your nude images online is terrifying, and if you’re a teen it may feel impossible to reverse it,” says a Take It Down promotional video included in Meta’s announcement. “While you can’t go back in time and unsend, we can help you move forward.”
The initiative is part of Meta’s efforts to curb “revenge porn,” or when someone posts explicit images of someone else without their consent to embarrass or bully them. It will also help prevent “sextortion,” or when someone uses online images as leverage to demand more images, sexual favors, or money, Meta says.
“This issue has been incredibly important to Meta for a very, very long time because the damage done is quite severe in the context of teens or adults,” Antigone Davis, Meta’s global safety director, said in an interview with CBS. “It can do damage to their reputation and familial relationships, and puts them in a very vulnerable position. It’s important that we find tools like this to help them regain control of what can be a very difficult and devastating situation.”
To use Take It Down, young people under 18, or parents and trusted adults acting on their behalf, can submit a case on the site. The fact that users initiate the process is a critical difference from Apple’s failed effort to scan iCloud uploads for child sexual abuse images, which the company announced in 2021 and then abandoned a year later after critics called it a security risk.
As part of a three-step process, users select the images or videos in question, but the site does not store them the way a typical upload would. Instead, Take It Down assigns each one a unique numerical code called a “hash.”
“Take It Down works by assigning a unique digital fingerprint, called a hash value, to nude, partially nude, or sexually explicit images or videos of people under the age of 18,” says the site. “This all happens without the image or video ever leaving your device or anyone viewing it. Only the hash value will be provided to NCMEC.”
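Meta and NCMEC haven’t published the exact fingerprinting scheme, but the described flow is that the hash is computed locally and only that value leaves the device. Here is a minimal, hypothetical Python sketch of that idea; it uses an ordinary SHA-256 digest purely as a stand-in for whatever fingerprint the real service derives, and the function name is illustrative, not part of any actual Take It Down tool.

```python
import hashlib


def fingerprint_file(path: str) -> str:
    """Compute a fingerprint of a local file without uploading its contents.

    The real service's hashing scheme is not public; a plain SHA-256 digest
    stands in here. Only this short string would be reported, never the
    image or video itself.
    """
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        # Read in chunks so large videos don't need to fit in memory.
        for chunk in iter(lambda: f.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()


if __name__ == "__main__":
    # Example: hash a local file and print the value that would be submitted.
    print(fingerprint_file("example.jpg"))
```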
With the hash, the system can locate the content on “participating apps,” take it down (per the site’s name), and prevent it from being re-posted. Those apps include Facebook and Instagram, at least to start, though Meta says it’s working with NCMEC to promote the service so other apps can integrate it as well.
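On the platform side, the announcement implies that participating apps compare the fingerprints of uploaded content against the list of reported hashes. The sketch below is an assumption-laden illustration of that check, not Meta’s or NCMEC’s actual code: in practice a perceptual hash would be used so resized or re-encoded copies still match, whereas this example uses exact digest comparison only to keep it short, and the names are hypothetical.

```python
import hashlib

# Hypothetical hash list a participating app would receive via NCMEC.
REPORTED_HASHES: set[str] = set()


def should_block_upload(upload_bytes: bytes) -> bool:
    """Return True if an upload matches a reported fingerprint.

    Exact SHA-256 matching stands in for the robust/perceptual matching a
    real system would need to catch altered copies of the same image.
    """
    digest = hashlib.sha256(upload_bytes).hexdigest()
    return digest in REPORTED_HASHES
```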
Meta has taken several steps over the years to limit teens’ interactions with what it deems “suspicious adults.” That includes banning adults from sending messages to users under 18 on Instagram unless those teens follow them, and defaulting kids under 16 to private accounts when they sign up. Adults flagged as suspicious will have limited access to teen accounts, and teens will receive notifications when such adults follow them, comment on their posts, tag them, or re-post their content.
Instagram flags suspicious users and asks minors “Do you know this person?” when such a user interacts with their profile.