
The Supreme Court protects the future of content moderation

The First Amendment and the NetChoice Cases: A Unanimous High Court Ruling on the Florida and Texas Laws

The appeals court decisions were thrown out for reasons unrelated to the First Amendment merits. According to SCOTUS, the lower courts had focused too narrowly, failing to consider that the laws apply to multiple companies and multiple products. Instead of conducting the analysis proper to a facial challenge, the appeals courts treated the cases as though each were “an as-applied challenge brought by Facebook protesting its loss of control over the content of its News Feed.”

NetChoice involves a pair of laws in Florida and Texas that sought to limit how large social media companies could moderate content on their websites. The legislation took shape after conservative politicians in both states criticized major tech companies for allegedly exerting bias against conservative viewpoints. NetChoice and the Computer & Communications Industry Association sued to block the laws. The Supreme Court was left to make the final call on whether the statutes could be upheld, as appeals courts in different circuits had come to different conclusions.

None of the justices dissented, but there were several concurring opinions. The majority opinion was written by Justice Kagan and was joined by Chief Justice John Roberts and Justices Sonia Sotomayor, Brett Kavanaugh, and Amy Coney Barrett. Justice Ketanji Brown Jackson joined part of the majority opinion. Justices Clarence Thomas and Samuel Alito wrote concurring opinions, and Thomas and Neil Gorsuch joined Alito’s.

The justices heard arguments in two cases in February. At the time, several justices prodded counsel about how the laws would impact tech companies that did not seem top of mind when they were authored — including Uber, Etsy, and Venmo.

The cases, Moody v. NetChoice and NetChoice v. Paxton, were unanimously returned to the lower courts for further analysis.

Justice Elena Kagan, writing for a unanimous court, said the parties had not briefed the critical issues and the record was underdeveloped.

The question before the high court was considered a significant First Amendment matter, one with the potential to rewrite the rules of the road for online free speech.

Why Are Social Media Sites a Target of Right-Wing Politics? A Brief History Since the Jan. 6 Capitol Riot

It all started when former President Trump was kicked off of Twitter, Facebook, Instagram and other social media platforms in the wake of the Jan. 6 riot at the Capitol.

Legislators in Florida and Texas passed laws prohibiting social media sites from banning political candidates or limiting their reach, citing claims that conservative voices were being blocked by tech companies.

The laws came even though there is evidence that the opposite is true: right-wing commentators often use social media as a megaphone.

During oral arguments in February, the justices grappled with whether Twitter, now X, and Meta have created what amounts to a modern-day public square, distinguishing them from other private companies.

Lawyers for the tech companies say they should not be forced to host accounts they object to. Past legal cases have also established that social media sites have a First Amendment right to decide what is and is not allowed to be published on their own platforms.

The First Amendment was invoked against the state laws, which would prevent the platforms from banning users such as Trump again.

Silicon Valley has argued that without that discretion, including the ability to suspend or block users, social media sites would be glutted with spam, hate speech and other unsavory content.

Under Section 230 of the Communications Decency Act, technology companies are protected from lawsuits arising from content hosted on their platforms, and they remain free to patrol speech on their own sites.

Section 230 has become a bipartisan punching bag. Conservatives argue the law gives platforms a free pass to censor right-wing perspectives, whereas liberals say it allows big social media firms to escape accountability for the rise of hate speech, disinformation and other harmful content.