The Supreme Court appeared skeptical of a pair of laws in Texas and Florida restricting how social media companies and others hosting user-generated content can moderate their sites.
On Monday, Supreme Court justices expressed doubt about laws from Texas and Florida aimed at regulating large social media companies' content moderation practices. During the arguments, which lasted nearly four hours, some justices suggested that these laws might breach the First Amendment by limiting the ability of platforms like Facebook and YouTube to moderate the content on their services.
The cases in question, NetChoice v. Paxton and Moody v. NetChoice, turn on whether the state laws violate the platforms' First Amendment rights, though the immunity granted to tech companies under Section 230 also loomed over the arguments.
The Cases
In NetChoice v. Paxton, NetChoice, a trade association representing social media companies and other internet platforms, argues that a Texas law (HB 20), which regulates how and when social media companies may moderate their content and requires them to give transparent explanations when they do so, violates their free speech rights. The U.S. Court of Appeals for the Fifth Circuit vacated a preliminary injunction that had blocked the law from taking effect.
Moody v. NetChoice, on the other hand, involves a similar Florida law (SB 7072) that also restricts how social media platforms may moderate content. The Eleventh Circuit affirmed the Florida injunction in part and vacated it in part. The questions before the Court are whether the laws' content-moderation restrictions and individualized-explanation requirements comply with the First Amendment.
HB 20 and SB 7072 were enacted after the Jan. 6, 2021, Capitol attack, driven by claims that platforms were censoring users, particularly those expressing conservative viewpoints, and both require platforms to explain their content moderation decisions.
The Controversy of Section 230
Section 230 of the Communications Decency Act of 1996 generally protects tech and media companies from legal liability for publications made by third-party users. It states that “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider”.
Section 230 embodies the principle that we should all be responsible for our own actions and statements online, but generally not for those of others. The law bars most civil suits against users or services that are based on what others say.
Supporters of Section 230 are concerned that placing a legal obligation on websites to moderate content posted by third parties would create an overburdensome duty that would hinder free speech and innovation. But opponents are concerned that Section 230 enables websites to turn a blind eye to misconduct occurring on their platforms, resulting in content moderation policies that are applied inconsistently.
The Impact on Tech
The outcome of these cases could have far-reaching implications for the tech industry. A ruling in favor of the states could significantly narrow the editorial discretion tech companies currently exercise and unsettle the broad protections they have long relied on under Section 230, potentially forcing social media platforms to overhaul their content moderation policies.
Trade groups representing social media platforms have challenged the laws, producing mixed decisions at the appellate level: the Fifth Circuit upheld the Texas law, which nonetheless remains on hold, while the Eleventh Circuit blocked most of the Florida law. Arguments in court highlighted the debate over whether social media platforms' content moderation constitutes protected speech under the First Amendment and whether these laws unduly restrict platforms' editorial control.
Attorneys representing the states argued the laws are modest efforts to regulate platforms' power, while representatives for the tech groups and the U.S. Solicitor General defended the platforms' rights to editorial discretion, likening them to newspaper editors and parade sponsors.
Justices expressed concern about the laws' broad application, potentially affecting not just social media feeds but also other services like Gmail, Uber, and Etsy. They also debated the interaction between these laws and Section 230 of the Communications Decency Act, which shields tech companies from liability for user-generated content.
Justice Elena Kagan, for instance, directly questioned the constitutionality of the laws, focusing on their impact on platforms like Facebook, YouTube, and X (formerly Twitter), and asked why these regulations should not be seen as a direct violation of the First Amendment, given that they restrict the platforms from making their own editorial decisions. Her line of questioning underscored the potential for these laws to interfere with platforms' rights to manage content as they see fit, a core component of free speech protections.
Justice Brett Kavanaugh and Chief Justice John Roberts both emphasized the fundamental principle that the First Amendment protects against governmental suppression of speech, signaling skepticism toward the laws' implications for social media companies' editorial freedoms. Kavanaugh, in particular, highlighted the longstanding protection of editorial control under the First Amendment, suggesting that the laws might overstep by infringing on this protected domain.
On the other hand, Justice Clarence Thomas and Justice Samuel Alito appeared more open to the arguments supporting the laws. Thomas challenged the notion of a protected right to censor content, asking what precedent establishes that the First Amendment protects a right to exclude certain speech. Alito, similarly, questioned whether content moderation by these platforms is anything more than a euphemism for censorship, indicating a potential openness to the states' regulatory efforts.
The Court's decision, expected by summer, will clarify the extent to which states can regulate social media platforms and how such regulations interact with federal law and constitutional protections for speech.