The Supreme Court appeared skeptical of a pair of laws in Texas and Florida restricting how social media companies and others hosting user-generated content can moderate their sites.
On Monday, Supreme Court justices expressed doubt about laws from Texas and Florida aimed at regulating large social media companies' content moderation practices. During the arguments, which lasted nearly four hours, several justices suggested that the laws might violate the First Amendment by limiting the ability of platforms like Facebook and YouTube to moderate their content.
The cases, NetChoice v. Paxton and Moody v. NetChoice, center on whether the laws violate the platforms' First Amendment rights, though the immunity granted to tech companies under Section 230 also figured prominently in the arguments.
The Cases
In NetChoice v. Paxton, NetChoice, a coalition of social media companies and internet platforms, argues that a Texas law (HB 20) regulating how and when social media companies can moderate their content, and requiring them to give transparent explanations when they do so, violates their free speech rights. The Fifth Circuit Court of Appeals overturned a preliminary injunction that had prevented the law from taking effect.
Moody v. NetChoice, on the other hand, involves a similar Florida law (SB 7072) that also restricts how social media platforms can moderate content. The Eleventh Circuit affirmed in part, and vacated in part, an injunction against the Florida law. The questions before the Court are whether the laws' content-moderation restrictions and individualized-explanation requirements comply with the First Amendment.
HB 20 and SB 7072 were enacted after the Jan. 6, 2021, attack on the U.S. Capitol, driven by claims that platforms were censoring users, particularly those expressing conservative viewpoints; both laws require platforms to explain their content moderation decisions.
The Controversy of Section 230
Section 230 of the Communications Decency Act of 1996 generally protects tech and media companies from legal liability for content posted by third-party users. It states that "No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider."
Section 230 embodies the principle that we should all be responsible for our own actions and statements online, but generally not those of others. The law prevents most civil suits against users or services that are based on what others say.
Supporters of Section 230 argue that imposing a legal obligation on websites to moderate content posted by third parties would create an overly burdensome duty that would hinder free speech and innovation. Opponents counter that Section 230 enables websites to turn a blind eye to misconduct occurring on their platforms, resulting in content moderation policies that are applied inconsistently.
The Impact on Tech
The outcome of these cases could have far-reaching implications for the tech industry. If the Supreme Court rules in favor of the states, it could significantly limit the immunity tech companies currently enjoy under Section 230 and could change the way social media platforms operate, forcing them to alter their content moderation policies.
Trade groups representing social media platforms have challenged the laws, leading to mixed decisions at the appellate level: the Fifth Circuit upheld the Texas law, though it is not in effect, while the Eleventh Circuit blocked most of the Florida law. Arguments in court highlighted the debate over whether social media platforms' content moderation constitutes protected speech under the First Amendment and whether these laws unduly restrict platforms' editorial control.
Attorneys representing the states argued the laws are modest efforts to regulate platforms' power, while representatives for the tech groups and the U.S. Solicitor General defended the platforms' rights to editorial discretion, likening them to newspaper editors and parade sponsors.
Justices expressed concern about the laws' broad application, potentially affecting not just social media feeds but also other services like Gmail, Uber, and Etsy. They also debated the interaction between these laws and Section 230 of the Communications Decency Act, which shields tech companies from liability for user-generated content.
Justice Elena Kagan directly questioned the constitutionality of the laws, focusing on their impact on platforms like Facebook, YouTube, and X (formerly Twitter), and asked why these regulations should not be seen as a direct violation of the First Amendment, given that they restrict the platforms from making their own editorial decisions. Her inquiry underscored the potential for the laws to interfere with platforms' right to manage content as they see fit, a core component of free speech protections.
Justice Brett Kavanaugh and Chief Justice John Roberts both emphasized the fundamental principle that the First Amendment protects against governmental suppression of speech, signaling skepticism toward the laws' implications for social media companies' editorial freedom. Kavanaugh, in particular, highlighted the longstanding protection of editorial control under the First Amendment and indicated that the laws might overstep by infringing on this protected domain.
On the other hand, Justices Clarence Thomas and Samuel Alito appeared more open to the arguments supporting the laws. Thomas challenged the notion of a protected right to censor content, asking what precedent supports a First Amendment right to exclude certain speech. Alito similarly questioned whether content moderation by these platforms is anything more than a euphemism for censorship, indicating a potential openness to the states' regulatory efforts.
The Court's decision, expected by summer, will clarify the extent to which states can regulate social media platforms and how such regulations interact with federal law and constitutional protections for speech.