The Supreme Court Debates the Future of Online Speech

The Supreme Court appeared skeptical of a pair of laws in Texas and Florida restricting how social media companies and others hosting user-generated content can moderate their sites.

On Monday, Supreme Court justices expressed doubt about laws from Texas and Florida aimed at regulating large social media companies' content moderation practices. During the arguments, which lasted nearly four hours, several justices suggested that the laws might violate the First Amendment by limiting the ability of platforms like Facebook and YouTube to moderate the content on their sites.

The cases in question, NetChoice v. Paxton and Moody v. NetChoice, center on whether the laws violate the platforms' First Amendment rights; the immunity granted to tech companies under Section 230 of the Communications Decency Act also figured in the arguments.

The Cases

In NetChoice v. Paxton, NetChoice, a trade association representing social media companies and internet platforms, argues that a Texas law (HB 20) violates its members' free speech rights. The law regulates how and when social media companies can moderate their content and requires them to provide transparent explanations when they do so. The Fifth Circuit Court of Appeals overturned a preliminary injunction that had prevented the law from taking effect.

Moody v. NetChoice involves a similar Florida law (SB 7072) that also restricts how social media platforms can moderate content. The Eleventh Circuit affirmed in part and vacated in part a preliminary injunction against that law. The question in both cases is whether the laws' content-moderation restrictions and individualized-explanation requirements comply with the First Amendment.

HB 20 and SB 7072 were enacted after the Jan. 6, 2021, Capitol attack, driven by claims that platforms were censoring users, particularly those expressing conservative viewpoints. Both laws require platforms to explain their content moderation decisions.

The Controversy of Section 230

Section 230 of the Communications Decency Act of 1996 generally protects tech and media companies from legal liability for publications made by third-party users. It states that “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”

Section 230 embodies the principle that we should all be responsible for our own actions and statements online, but generally not those of others. The law bars most civil suits against users or services that are based on what others say.

Supporters of Section 230 are concerned that placing a legal obligation on websites to moderate content posted by third parties would create an overburdensome duty that would hinder free speech and innovation. But opponents are concerned that Section 230 enables websites to turn a blind eye to misconduct occurring on their platforms, resulting in content moderation policies that are applied inconsistently.

The Impact on Tech

The outcome of these cases could have far-reaching implications for the tech industry. If the Supreme Court rules in favor of the states, it could significantly narrow the editorial discretion platforms have exercised under the protection of the First Amendment and Section 230, potentially changing the way social media platforms operate and forcing them to alter their content moderation policies.

Trade groups representing social media platforms have challenged the laws, leading to mixed decisions at the appellate level: the Texas law was upheld by the Fifth Circuit but is not in effect, while the Eleventh Circuit blocked most of the Florida law. Arguments in court highlighted the debate over whether social media platforms' content moderation constitutes protected speech under the First Amendment and whether these laws unduly restrict platforms' editorial control.

Attorneys representing the states argued that the laws are modest efforts to regulate the platforms' power, while counsel for the tech groups and the U.S. Solicitor General defended the platforms' right to editorial discretion, likening them to newspaper editors and parade organizers.

Justices expressed concern about the laws' broad application, potentially affecting not just social media feeds but also other services like Gmail, Uber, and Etsy. They also debated the interaction between these laws and Section 230 of the Communications Decency Act, which shields tech companies from liability for user-generated content.

Justice Elena Kagan, for instance, directly questioned the constitutionality of the laws, focusing on their impact on platforms like Facebook, YouTube, and X (formerly Twitter). She asked why the regulations should not be seen as a direct violation of the First Amendment, given that they restrict the platforms from making their own editorial decisions. Her questions underscored the potential for these laws to interfere with platforms' right to manage content as they see fit, a core component of free speech protections.

Justice Brett Kavanaugh and Chief Justice John Roberts both emphasized that the First Amendment protects against governmental suppression of speech, signaling skepticism toward the laws' implications for social media companies' editorial freedom. Kavanaugh, in particular, highlighted the longstanding protection of editorial control under the First Amendment, suggesting that the laws might overstep by infringing on this protected domain.

On the other hand, Justices Clarence Thomas and Samuel Alito appeared more open to the arguments supporting the laws. Thomas challenged the notion of a protected right to censor content, asking what precedent establishes First Amendment protection for a right to exclude certain speech. Alito, similarly, questioned whether content moderation by these platforms is anything more than a euphemism for censorship, indicating a potential openness to the states' regulatory efforts.

The Court's decision, expected by summer, will clarify the extent to which states can regulate social media platforms and how such regulations interact with federal law and constitutional protections for speech.
