Social media regulations may loosen pending Supreme Court decision

Over the last decade and a half, social media has completely torn down and rebuilt the ways we connect and interact with each other. Soon, however, those platforms and the rules governing how they moderate content could open a new chapter, depending on what happens in Washington.

The Supreme Court will hear cases this year that put Section 230 at the center of the debate. Part of the Communications Decency Act of 1996, the provision shields web operators from liability for what their users post and allows them to moderate speech and content on their platforms as they see fit.

The act was originally introduced to prevent minors from gaining access to sexually explicit content on the internet. Section 230 gives platforms the legal cover to moderate that kind of material and limit who can access it.

“The last thing we need is for content to be micromanaged,” said senior business major Brad Kessler in an interview. “I honestly don’t know what kind of rules could stem from anything in Washington that would make a difference.”

Multiple court cases currently threaten to undermine Section 230. One is Twitter v. Taamneh, an anti-terrorism case that began when the family of Jordanian citizen Nawras Alassaf sued Twitter and Facebook after Alassaf was killed in an ISIS-affiliated attack in Istanbul.

Another anti-terrorism case is Gonzalez v. Google. The family of American Nohemi Gonzalez sued Google after Gonzalez was killed in the November 2015 ISIS attacks in Paris, claiming Google bore responsibility because YouTube’s recommendations steered users toward the terrorist group’s recruitment videos.

Google has released a statement on what could happen if Section 230 were struck down: “The internet would devolve into a disorganized mess and a litigation minefield.”

As if Elon Musk isn’t in the spotlight enough, he has a major stake in the Supreme Court’s decision, which could come within the next year. Musk’s “the bird is freed” declaration after buying Twitter may foreshadow something more than free speech on the app, which now grants blue check mark verification to any account willing to pay for an $8 subscription.

Lifting Section 230 would only make things worse for Twitter. Musk could either exert as much control over accounts and content as he pleases, which raises questions of legality, or exert hardly any at all, which would be just as bad, if not worse, than closely monitoring every user.

Either way, Musk’s big-time purchase would become a hellhole of harmful, uninvited content from hostile or indifferent users.

From a business standpoint, social media platforms that could no longer guarantee safety and security on their sites would not only lose their current users but also lose the younger demographic that has yet to explore the world of popular internet trends and apps.

Journalism, my own career path, would be among the industries hit hardest by this deregulation. Without protections or any reliable way to validate sources, the news side of TikTok, Twitter and Instagram would be catastrophic, setting a crucial part of the modern information cycle ablaze.

If anything, there should be more regulation than before of the content accessible on different platforms. AI accounts and bots have already started to gain popularity on some platforms and can be managed well. Without constant monitoring, however, content could become more deceptive than ever.

Section 230 has been law for almost three decades, and while it may raise questions at times, it allows moderators to decide what is and isn’t safe. Without this legislation, mayhem would ensue, and a barrage of deceptive, antagonistic material would be at our fingertips.

Let’s hope the mighty nine in robes make the right choice.


alamatt1@ramapo.edu

Photo courtesy of Laura Stanley, Pexels