‘Take It Down’ empowers underage victims

I’m sure we have all heard that the internet is forever and that we should be wary of our digital footprints. Nevertheless, teens do not seem to heed this warning before sending out explicit photos of themselves. Once a picture has been sent, it can never be taken back.

However, thanks to the National Center for Missing and Exploited Children (NCMEC), there is now a process that can help those who have sent nude pictures get them taken down from wherever they have ended up. The program, aptly called “Take It Down,” “allows users from around the world to submit a report that can help remove online nude, partially nude, or sexually explicit photos and videos depicting a child under 18 years old,” according to the NCMEC.

The process is entirely free and anonymous, and platforms including Facebook, Instagram, Pornhub, MindGeek, OnlyFans and Yubo are participating. The “Take It Down” website offers resources on how to find one’s nude photos and how to contact the platforms hosting them, along with tiplines, frequently asked questions and places to find support.

This is an incredible tool for several reasons. In my opinion, the smartest part is the anonymity. Teens often do not want to get law enforcement involved when dealing with explicit photos, and the demand for a resource like “Take It Down” is clear. According to ABC News, “The nonprofit’s CyberTipline received 29.3 million reports in 2021, up 35% from 2020.” One can assume that the number of reports has only grown since then as the program’s popularity increases.

It is true that law enforcement may not be understanding of certain situations, and teens may not feel comfortable confiding in an authority figure that they were coerced into sending explicit photos — because nudes are not always consensual.

It is not uncommon for people to threaten others into sending or sharing these types of photos. That is an entire problem of its own, but at least now, thanks to the NCMEC, there is a way to take back what has been sent.

With technology on the rise as well, what is known as a “deepfake” is now surfacing: an “artificial intelligence-generated image” that is “created to look like real, actual people saying or doing things they didn’t actually do,” as described by ABC News.

People can create deepfake pornography of whomever they want, which is not only disgusting but completely unethical. Popular content creator Atrioc recently came under fire when fans noticed he had deepfake pornography of other female content creators, some of whom he was friends with.

One victim, streamer QTCinderella, went live in tears to explain her feelings, stating on Twitch that “the constant exploitation and objectification of women…[is] exhausting,” and she is completely right. Unfortunately, she is 28 years old, meaning she cannot participate in “Take It Down.” She does plan to sue whoever was behind the images, though.


Even with the one limitation of only helping those whose images were taken when they were under the age of 18, I still believe “Take It Down” is revolutionary. This is a huge and arguably vital step in the right direction toward putting an end to the exploitation of children and teens. The first step in the process is selecting whether the photos show them under or over the age of 18; if they select over, they are directed to another program called StopNCII.org.

If you or someone you know could benefit from this, I highly recommend checking out “Take It Down” itself or StopNCII. As the program itself states, “having nudes online is scary, but there is hope to get it taken down.”


ajones11@ramapo.edu

Featured photo by Lydia Fries