The Supreme Court will hear two landmark cases regarding content moderation on social media.
The United States Supreme Court has released a series of orders following its September ‘long conference,’ a traditional meeting held at the end of the summer recess where the justices review petitions that accumulated over the break. The court granted certiorari in 12 cases, to be argued during the 2023–24 term in January or February.
Among them are two landmark cases, Moody v. NetChoice and NetChoice v. Paxton, which challenge state laws restricting content moderation on social media platforms.
Moody v. NetChoice
The case that became Moody v. NetChoice was filed by NetChoice and the Computer & Communications Industry Association (CCIA) in the U.S. District Court for the Northern District of Florida on May 27, 2021, challenging Florida’s regulation of content moderation by social-media providers. According to the complaint, the Act in question facially violates the First Amendment rights of America’s leading businesses by compelling them to host even highly objectionable content that is not appropriate for all viewers, violates their terms of service, or conflicts with the companies’ policies and beliefs.
After the district court preliminarily enjoined the law and the Eleventh Circuit largely upheld that injunction in May 2022, NetChoice and co-plaintiff CCIA filed a petition for certiorari with the U.S. Supreme Court in NetChoice v. Moody on October 24, 2022.
NetChoice v. Paxton
NetChoice v. Paxton is a legal case challenging a Texas law that prohibits large social media platforms from “censoring” content based on a user’s viewpoint or geographic location. NetChoice, a trade association whose members include Meta and TikTok, sued Texas in September 2021, before the law took effect. The district court granted a preliminary injunction blocking the law, but the Fifth Circuit stayed the injunction in May 2022; the Supreme Court vacated that stay on May 31, 2022. The Fifth Circuit later upheld the law on the merits in September 2022, prompting NetChoice’s petition for certiorari.
Implications of the Supreme Court Review
These cases are significant because they implicate protected speech under the First Amendment, content moderation by providers of interactive computer services under Section 230 of the Communications Decency Act, and the Florida and Texas laws that sought to limit that moderation.
The outcomes could reshape Section 230 of the Communications Decency Act, which currently shields online services from liability for content posted by their users. A ruling against the social media platforms could force significant changes in how platforms moderate content.
The court’s decision will likely set a precedent for future legislation on digital rights and responsibilities and could influence how other states approach similar issues. Depending on the outcome, social media platforms may need to revise their content moderation policies, reshaping the user experience and the nature of discourse on those platforms. The decision could also invite further legal challenges, whether from users who believe their speech has been unfairly restricted or from platforms defending their moderation policies.