When unsending isn't an option

Teens can proactively block their nude images from Instagram, OnlyFans

Hundreds are already using the tool as teen financial sextortion cases increase.

Over the past few years, the National Center for Missing and Exploited Children (NCMEC) has seen worrying trends indicating that teen sextortion is on the rise online and, in extreme cases, is leading to suicides. Between 2019 and 2021, the number of sextortion cases reported on NCMEC’s online tipline more than doubled. At the start of 2022, nearly 80 percent of those cases involved teens suffering financial sextortion—pressured to send cash or gift cards or else see their sexualized images spread online.

NCMEC already manages a database that works to stop the spread of child sexual abuse materials (CSAM), but that tool wouldn't work for teens ashamed of being caught up in sextortion, because the information gathered with every report is not anonymized. Teens escaping sextortion needed a different kind of tool, NCMEC realized: one that removed all shame from the reporting process and worked more proactively, allowing minors to anonymously report sextortion before any of their images are ever circulated online.

Today, NCMEC officially launched that tool—Take It Down. Since its soft launch in December, more than 200 people have already used it to block uploads or remove images of minors shared online, NCMEC’s communications and brand vice president, Gavin Portnoy, told Ars.

To use Take It Down, anyone—minors, parents, other concerned parties, or adults worried about their own underage images being posted online—can anonymously access the platform on NCMEC’s site. Take It Down then generates a hash representing each image or video that users report as sexualizing minors, including images with nudity, partial nudity, or sexualized poses. From there, any online platform that has partnered with the initiative will automatically block uploads or remove content matching that hash.
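The hash-matching flow described above can be sketched in a few lines. Note that the article doesn't name the hashing scheme Take It Down uses, so SHA-256 stands in here purely for illustration; real image-matching systems typically rely on perceptual hashes so that resized or re-encoded copies of an image still match. All function and variable names below are hypothetical:

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    """Return a hex digest identifying the image content.

    SHA-256 is an illustrative stand-in; a production system would
    likely use a perceptual hash so near-duplicate copies still match.
    """
    return hashlib.sha256(image_bytes).hexdigest()

# Hashes submitted through the reporting service; only the hash,
# never the image itself, needs to leave the reporter's device.
blocked_hashes: set[str] = set()

def report_image(image_bytes: bytes) -> None:
    """Anonymous report: record the image's hash on the blocklist."""
    blocked_hashes.add(fingerprint(image_bytes))

def allow_upload(image_bytes: bytes) -> bool:
    """A partner platform rejects uploads whose hash is blocklisted."""
    return fingerprint(image_bytes) not in blocked_hashes

# Example: once an image is reported, matching uploads are refused.
reported = b"fake image bytes for illustration"
report_image(reported)
print(allow_upload(reported))                # blocked
print(allow_upload(b"some unrelated image"))  # allowed
```

The key privacy property this design illustrates is that the hash is one-way: platforms can recognize a reported image without the reporting service ever storing or transmitting the image itself.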

For NCMEC, it’s perhaps the most proactive measure provided yet to minors to limit the spread of this traumatizing content, and it can help teens avoid bullying, as well as benefit survivors of sextortion, human trafficking, and revenge porn, Portnoy told Ars.

Take It Down’s message to minors is: “You can't go back and unsend, but we can help you move forward,” Portnoy said—without feeling shamed for any role you possibly played in sharing images. Teens “don't have to provide any other information if they don't want to. It's as simple as saying, ‘hey, I think this thing is out there, it can be really damaging to me, please, please take it down.’”

Meta, Pornhub help with launch

Helping to fund its launch, Meta is among Take It Down’s biggest partners, announcing in a blog post that teens can now use the platform to block uploads and remove content on Facebook and Instagram.

“Having a personal intimate image shared with others can be scary and overwhelming, especially for young people,” Antigone Davis, Meta’s global head of safety, wrote. “It can feel even worse when someone tries to use those images as a threat for additional images, sexual contact, or money.”

Davis said that Facebook and Instagram users would soon be able to connect to the Take It Down platform from within the apps. Another Take It Down partner, the social networking app Yubo, already provides this type of functionality.

Yubo co-founder and CEO Sacha Lazimi told Ars that users who find nonconsensual images online “can now choose the option to report an 'inappropriate photo/video of me,' which immediately directs them to NCMEC’s Take It Down site, where they can securely submit anonymous reports.”

Teens can also use Take It Down to remove nonconsensual content on Pornhub and OnlyFans, additional founding partners. Portnoy told Ars that NCMEC expects other online platforms to join the Take It Down initiative soon.

Take It Down, Portnoy said, differs from other databases for reporting CSAM or adult revenge porn because platforms must opt in before Take It Down can flag and remove content on them. Having Meta involved at launch, though, may influence other major social platforms, like TikTok or Twitter, to get involved.

OnlyFans Chief Strategy and Operations Officer Keily Blair told Ars that OnlyFans sees joining Take It Down as part of enforcing its zero-tolerance policy for sharing minors’ intimate images, saying, “We believe that platforms have a responsibility to protect children online.” A spokesperson for Pornhub owner MindGeek told Ars, “We encourage all image-sharing platforms to follow our lead and participate in Take it Down.”

Portnoy said that the risk to minors has recently expanded from simpler cases of kids bullying each other over compromising images to more frequent cases of financial sextortion, which particularly impacts teen boys who can be scammed or catfished into sharing compromising content. Launching Take It Down now is a big part of NCMEC’s plan to reverse that trend by equipping minors to prevent sextortion anywhere teens can be exploited online.

“Kids are freaking out over this,” Portnoy told Ars.