Jurisprudence

The First Amendment Just Dodged an Enormous Bullet at the Supreme Court


At Supreme Court oral argument in the Texas social media case back in February, Justice Samuel Alito asked: “Let’s say YouTube were a newspaper, how much would it weigh?” In Monday’s Supreme Court opinion in Moody v. NetChoice, a five-justice majority, over Alito’s objection, did not directly answer that absurd question, but it did say that under the First Amendment, Facebook should get about the same amount of editorial discretion as the Miami Herald. And that’s some good news from an otherwise bleak end of the Supreme Court term.

As I first explained at Slate back in February, Moody concerns a pair of laws, one from Florida and one from Texas, that regulate different aspects of social media. Among the most important provisions in each of the laws is a limit on content moderation. The state laws differ in their particulars, but both were motivated by the removal of Donald Trump from major social media platforms after the violence of Jan. 6, 2021, and a general complaint that the platforms were unfairly “censoring” conservative voices. A coalition of internet companies, NetChoice, challenged the laws, claiming they violated the First Amendment. The U.S. Court of Appeals for the 11th Circuit put key parts of the Florida law on hold, but the 5th Circuit allowed Texas’ law to go forward pending further proceedings—an action the Supreme Court reversed as it considered these issues.

In Monday’s opinion in Moody, the Supreme Court was unanimous in holding that the way that NetChoice litigated its cases was not proper. It had brought a “facial” challenge to the laws under the First Amendment, which essentially requires showing that in almost any way that the state might try to enforce its law, doing so would be unconstitutional. The justices agreed that these laws were very complex and the issues were not fully developed. As Justice Elena Kagan explained for the majority, the cases were litigated as if they were just about whether Facebook could curate its news feed. But it was not clear how these laws might apply to Gmail, or Etsy, or Venmo. The cases are going back to both courts for better legal and factual development.

That’s where the agreement among the justices ended. Speaking for herself, Chief Justice John Roberts, and Justices Amy Coney Barrett, Brett Kavanaugh, and Sonia Sotomayor, Kagan gave guidance on where the 5th Circuit went wrong in its First Amendment analysis in considering the constitutionality of the Texas content moderation provisions. None of this was necessary for the decision (in legal parlance, it was “dicta”), but the court addressed the issue because “[i]f we said nothing about those views, the court presumably would repeat them when it next considers NetChoice’s challenge.” The other justices would not have reached the First Amendment merits, although Alito expressed some serious reservations about the analysis.

Kagan’s guidance relied heavily on a 1974 case, Miami Herald v. Tornillo, in which the court held unconstitutional a Florida law that required newspapers to print the reply of someone who had been criticized in the newspaper. The court held that private actors like newspapers have every right under the First Amendment to include or exclude content as they see fit.

To Kagan, social media companies in moderating content were just like newspapers. She said that curating content is expressive activity protected by the First Amendment, that this protection includes the decision to exclude content, and that the principle holds even if most content is allowed and only a little is excluded. Further, when it comes to laws regulating speech, “the government cannot get its way just by asserting an interest in improving, or better balancing, the marketplace of ideas.” Were the rule otherwise, Kagan asserted, the platforms could be forced by Texas law to carry bad content including posts that “support Nazi ideology; advocate for terrorism; espouse racism, Islamophobia, or anti-Semitism; glorify rape or other gender-based violence; encourage teenage suicide and self-injury; discourage the use of vaccines; advise phony treatments for diseases; [and] advance false claims of election fraud.”

Moody might seem like an unremarkable decision, consistent with long-standing First Amendment principles. And indeed, in an amicus brief that I filed in the cases with political scientist Brendan Nyhan and journalism professor Amy Wilentz, co-authored with Nat Bach and his team at Manatt Phelps, we argued that Tornillo is the right analogy.

But in endorsing this view of the First Amendment, the majority brushed aside a major argument made by Justice Clarence Thomas in earlier cases and by First Amendment scholar Eugene Volokh that social media companies should be treated differently because they function like “common carriers,” such as the phone company. Just as Verizon cannot deny you a phone because of what you might say using it, the argument goes, Facebook must be open to everyone’s views.

The court gives the argument the back of its hand, never even addressing it directly; Alito says the majority “brushes aside the argument without adequate consideration.” Thomas says the argument should still be pursued in the lower courts, but it’s squarely inconsistent with what the Kagan majority says in its dicta. Volokh too sees many unanswered questions and thinks there is still a chance for some parts of these laws to be upheld when the cases get back to the lower court.

Beating back this common carrier argument, at least for now, is a victory for democracy. Had Texas won, as we argued in our amicus brief, a state would be free to require a social media company to carry not just election denialist speech but speech that foments political violence.

Social media companies are in many ways not like newspapers in terms of volume, reach, and the ability of everyone to share their own views and opinions. But when social media companies regulate content, they can act as responsible corporate citizens and remove speech they find objectionable. We know from Elon Musk’s takeover of Twitter (now X) that not all platforms will act responsibly. That’s a pity, but as the Supreme Court said Monday, it’s a choice for the platforms to make.

This is part of Opinionpalooza, Slate’s coverage of the major decisions from the Supreme Court this June.