The US Supreme Court today heard oral arguments on Florida and Texas state laws that impose limits on how social media companies can moderate user-generated content.
The Florida law prohibits large social media sites like Facebook and Twitter (aka X) from banning politicians, and says they must “apply censorship, deplatforming, and shadow banning standards in a consistent manner among its users on the platform.” The Texas statute prohibits large social media companies from moderating posts based on a user’s “viewpoint.” The laws were supported by Republican officials from 20 other states.
The tech industry says both laws violate the companies’ First Amendment right to use editorial discretion in deciding what kinds of user-generated content to allow on their platforms, and how to present that content. The Supreme Court will decide whether the laws can be enforced while the industry lawsuits against Florida and Texas continue in lower courts.
How the Supreme Court rules at this stage in these two cases could give one side or the other a big advantage in the ongoing litigation. Paul Clement, a lawyer for Big Tech trade group NetChoice, today urged justices to reject the idea that content moderation conducted by private companies is censorship.
“I really do think that censorship is only something that the government can do to you,” Clement said. “And if it’s not the government, you really shouldn’t label it ‘censorship.’ It’s just a category mistake.”
Companies use editorial discretion to make websites useful for users and advertisers, he said, arguing that content moderation is an expressive activity protected by the First Amendment.
Justice Kagan talks anti-vaxxers, insurrectionists
Henry Whitaker, Florida’s solicitor general, said that social media platforms marketed themselves as neutral forums for free speech but now claim to be “editors of their users’ speech, rather like a newspaper.”
“They contend that they possess a broad First Amendment right to censor anything they host on their sites, even when doing so contradicts their own representations to consumers,” he said. Social media platforms should not be allowed to censor speech any more than phone companies are allowed to, he argued.
Contending that social networks don’t really act as editors, he said that “it is a strange kind of editor that does not actually look at the material” before it is posted. He also said that “upwards of 99 percent of what goes on the platforms is basically passed through without review.”
Justice Elena Kagan replied, “But that 1 percent seems to have gotten some people extremely angry.” Describing the platforms’ moderation practices, she said the 1 percent of content that is moderated is “like, ‘we don’t want anti-vaxxers on our site or we don’t want insurrectionists on our site.’ I mean, that’s what motivated these laws, isn’t it? And that’s what’s getting people upset about them is that other people have different views about what it means to provide misinformation as to voting and things like that.”
Later, Kagan said, “I’m taking as a given that YouTube or Facebook or whatever has expressive views. There are particular kinds of expression defined by content that they don’t want anywhere near their site.”
Pointing to moderation of hate speech, bullying, and misinformation about voting and public health, Kagan asked, “Why isn’t that a classic First Amendment violation for the state to come in and say, ‘we’re not going to allow you to enforce those sorts of restrictions?’”
Whitaker urged Kagan to “look at the objective activity being regulated, namely censoring and deplatforming, and ask whether that expresses a message. Because they [the social networks] host so much content, an objective observer is not going to readily attribute any particular piece of content that appears on their site to some decision to either refrain from or to censor or deplatform.”
Thomas: Who speaks when an algorithm moderates?
Justice Clarence Thomas expressed doubts about whether content moderation conveys an editorial message. “Tell me again what the expressive conduct is that, for example, YouTube engages in when it or Twitter deplatforms someone. What is the expressive conduct and to whom is it being communicated?” Thomas asked.
Clement said the platforms “are sending a message to that person and to their broader audience that that material” isn’t allowed. As a result, users are “not going to see material that violates the terms of use. They’re not going to see a bunch of material that glorifies terrorism. They’re not going to see a bunch of material that glorifies suicide,” Clement said.
Thomas asked who is doing the “speaking” when an algorithm performs content moderation, particularly when “it’s a deep-learning algorithm which teaches itself and has very little human intervention.”
“So who’s speaking then, the algorithm or the person?” Thomas asked.
Clement said that Facebook and YouTube are “speaking, because they’re the ones that are using these devices to run their editorial discretion across these massive volumes.” The need to use algorithms to automate moderation demonstrates “the volume of material on these sites, which just shows you the volume of editorial discretion,” he said.
Sotomayor has “a problem with laws like this”
Justice Brett Kavanaugh questioned Whitaker’s view of whether the First Amendment targets only government-imposed restrictions on speech. “In your opening remarks you said, ‘the design of the First Amendment is to prevent suppression of speech.’ And you left out what I understand to be three key words in the First Amendment or to describe the First Amendment: ‘By the government.’ Do you agree ‘by the government’ is what the First Amendment is targeting?”
Whitaker replied, “I do agree with that, but I don’t agree that there is no First Amendment interest in allowing the people’s representatives to promote the free exchange of ideas.”
The Florida and Texas cases were argued separately today. Although the cases raise similar constitutional questions, the laws are a bit different. “I have a problem with laws like this that are so broad that they stifle speech just on their face,” Justice Sonia Sotomayor said during the Texas arguments.
Aaron Nielson, the Texas solicitor general, said that social networks are the “modern public square” and that the Texas law “is a modest effort to regulate such power in the context of viewpoint discrimination.”
Facial challenges make things complicated
One possible roadblock for the tech industry is that each case involves a facial challenge, in which the challenger must show a law is unconstitutional across the board rather than only as applied to particular parties. A successful facial challenge strikes down a law in its entirety rather than narrowing its scope. “It comes to the Court on a facial challenge, which means that the only question before the Court is whether the statute has a plainly legitimate sweep,” Whitaker said during the Florida arguments.
Justice Amy Coney Barrett said that Florida’s law “is very broad” and appears to cover more than just social networks. It could cover Uber, Google search, and Amazon, she said.
“Don’t we have to consider these questions Justice Alito is raising about DMs [direct messages] and Uber and Etsy because we have to look at the statute as a whole? And, I mean, we don’t have a lot of briefing on this, and this is a sprawling statute and it makes me a little bit nervous,” she said.
Barrett asked Clement how the court can strike down the Florida law’s effect on social networks without nullifying parts of the law that impose restrictions on other types of services, like email and direct messaging.
“Let’s assume that I agree with you about YouTube and Facebook feeds, news feeds, but that I don’t want to say that Facebook Marketplace or Gmail or DMs are not within the statute’s plainly legitimate sweep,” Barrett said.
Clement acknowledged that “I’m not sure you could reach that result without definitively holding that that stuff is within the plainly legitimate sweep of the statute.” But he went on to suggest that the court could “affirm the preliminary injunction, and then you would perhaps lament the fact that the record here is somewhat stunted, and then you would make clear that there might be a possibility to modify the preliminary injunction on remand.”
Biden admin opposes Florida and Texas laws
The Biden administration has weighed in against the Florida and Texas laws, with Solicitor General Elizabeth Prelogar representing the US government during today’s arguments. Social media platforms “shape and present collections of content on their websites, and that inherently expressive activity is protected by the First Amendment,” Prelogar said.
“These platforms are private parties. They’re not bound by the First Amendment,” Prelogar said.
The conduct of social media companies can be regulated in other ways, but “governments have to stay within the bounds of the First Amendment,” she said. “And these state laws which restrict the speech of the platforms to enhance the relative voice of certain users don’t withstand constitutional scrutiny.”