Regulatory Policy

Should Social-Media Companies Be Considered ‘Common Carriers’?

Despite legitimate frustrations and complaints about content moderation online, Justice Thomas’s heavy-handed regulatory prescription is not the answer.

Supreme Court Justice Clarence Thomas’s concurrence in Biden v. Knight First Amendment Institute at Columbia University recommends regulating social-media platforms as if they were common carriers or public accommodations, so that their First Amendment rights to exclude speech would be curtailed. While the U.S. does have a history of employing such regulations, the wisdom of those precedents and the extent to which they are appropriate for social-media companies are less certain.

Section 230 of the Communications Decency Act of 1996 is the current controlling federal law. It broadly immunizes providers of interactive computer services from liability both for third-party speech posted on their platforms and for removing or restricting that content.

People at both ends of the political spectrum are unhappy with Section 230’s often imperfect content-moderation results. Many on the left want more of what they view as harmful content taken down, while many on the right claim that too much of their content is removed for politically biased reasons.

But even if current congressional efforts to repeal Section 230 succeed, online platforms would retain their First Amendment right to remove speech they don’t want to carry. That’s precisely why Thomas takes aim at weakening Facebook’s, Twitter’s, and similar platforms’ claims to such protections.

In his concurrence, Justice Thomas lays out a case for regulating social-media platforms as common carriers or public accommodations in order to restrict their rights of exclusion. The new regime would amount to digital forced access: social-media companies’ First Amendment right to regulate or remove speech on their private platforms would be eliminated or curtailed.

From a legal perspective, however, the precedents for imposing speech restrictions on private entities may not easily translate to the way that social-media companies operate, or the markets that they serve.

In Pruneyard Shopping Center v. Robins, the Supreme Court held that a shopping mall could be forced to allow the distribution of leaflets and the gathering of signatures on its premises. The case does suggest some parallels for preventing social-media platforms from restricting speech. But defenders of private property rightly dislike this ruling, and those on the right could easily object, on the same bedrock property-rights grounds, to private platforms being forced to host speech against their will.

Turner Broadcasting System, Inc. v. FCC is precedent for the government’s imposing obligations to carry speech: it upheld rules obligating cable companies to carry local broadcast stations, and, interestingly, Justice Thomas joined the dissent when the question was revisited in 1997. But unlike cable franchises, social-media platforms have made no secret of reserving the right to deny service to any user under their terms of service. After all, if Twitter and Facebook cannot exercise “editorial” discretion over the posts on their platforms, how will they ever attract advertisers, who understandably don’t want their brands popping up next to God knows what? Exclusion is central to these companies’ business model, an important distinction from traditional common carriers.

On the flip side, in Miami Herald Publishing Co. v. Tornillo, the Supreme Court ruled against compelled speech, holding that a newspaper could not be forced to publish replies to its criticism of political candidates, regardless of its local market share. Similarly, in Hurley v. Irish-American Gay, Lesbian and Bisexual Group of Boston, the Court protected parade organizers’ right to exclude participants. These two cases align more closely with the speech and curation functions that are fundamental to social-media platforms.

To justify government regulation, Thomas mistakenly claims that there is insufficient competition in the social-media space. He writes, “That these companies have no comparable competitors highlights that the industries may have substantial barriers to entry.” In reality, these platforms constantly face new market entrants, among them Snapchat, Clubhouse, and TikTok. The next generation of social media, much of which has yet to be invented, will likely be decentralized and even less akin to the entities that have been regulated as common carriers in the past. And beyond comparable social-media alternatives to Twitter and Facebook (both of which have banned former President Trump), there still exist television, radio, and the rest of the Internet.

The practical reality of forcing social-media companies to carry speech to which they object might not be the panacea that conservative critics imagine. Much of what is culled from these platforms is spam, extreme hate speech, and disturbing (but constitutionally protected) content. An Internet with no moderation would quickly become a place that very few people would want to visit; Facebook would more aptly be called “Pornbook” in short order.

While there are legitimate frustrations and complaints about content moderation online, Thomas’s heavy-handed regulatory prescription is not the answer. The unintended consequences of common-carrier regulation counsel against the idea, and the marketplace is already at work decentralizing control. Most fundamentally, despite the opinion of one Supreme Court justice, case law does not suggest that laws restricting the First Amendment rights of social-media platforms would survive judicial review.

Jessica Melugin is the associate director of the Center for Technology and Innovation at the Competitive Enterprise Institute.
