This just-published article is here; the Abstract:
The Supreme Court’s 2023 Twitter v. Taamneh ruling arrived at a moment of potentially great change in the laws governing Internet platforms. The ruling, including Justice Thomas’s factually dubious descriptions of platforms as passive and agnostic toward user content, will have major influence in two categories of future cases.
- Most obviously, Taamneh will affect cases in which plaintiffs seek to hold platforms liable for unlawful material posted by users.
- Less obviously, Taamneh is sure to affect “must-carry” claims or mandates for platforms to carry content against their will—including major cases on the Supreme Court’s 2023–24 docket about laws in Texas and Florida.
This piece explains how Taamneh may be invoked in both kinds of cases, and explores the strengths and limitations of parties’ likely arguments—in particular, arguments based on platforms’ putative passivity. It also examines important intersections between must-carry claims and claims seeking to hold platforms liable for unlawful user content. It argues that ignoring those connections may cause cases in one area of law to distort outcomes in the other. Taamneh itself may already have done just that.
And an excerpt from the Introduction:
Efforts to regulate platforms in the U.S. are often stymied by a fundamental disagreement. Some voters and leaders—mostly Democratic—want platforms to remove more content, including “lawful but awful” material like medical misinformation or hate speech. Some—mostly Republicans—want platforms to remove less content. These same competing impulses appear in litigation. Some plaintiffs sue to prevent platforms from removing content, and others to hold them liable for failing to do so. Courts, unlike Congress, may soon be forced to resolve the tensions between these competing imperatives. Their decisions will almost certainly be shaped by a 2023 Supreme Court ruling, Twitter v. Taamneh.
Taamneh was one of two closely linked cases about platforms’ responsibility for user content. In it, the Court unanimously held that Twitter, Facebook, and YouTube were not liable for harms from ISIS attacks. In the second case, Gonzalez v. Google, the Court declined to decide whether the platforms were also immunized under the law known as Section 230. Litigation on other claims that would effectively hold platforms liable for failing to remove user speech—which I will call “must-remove” cases—continues in lower courts.
Questions about the opposite legal pressure, under so-called must-carry laws that prevent platforms from removing certain user speech, will be considered by the Court this Term in NetChoice v. Paxton and Moody v. NetChoice. Those cases challenge must-carry laws in Texas and Florida. Platforms argue that the laws violate their First Amendment rights, by stripping them of editorial discretion to set speech policies and remove harmful content.
Must-carry and must-remove claims generally come up in separate cases and are treated as unrelated by courts. But they are deeply intertwined. Taamneh, as the Court’s first major holding on platforms and speech since the 1990s, will matter to both. This piece examines the connections between must-remove and must-carry claims through the lens of the tort law questions examined in Taamneh. It argues that Taamneh’s emphasis on platform “passivity,” which permeates the opinion, may have troubling ramifications. Indeed, Texas relied heavily on Taamneh in a brief supporting its must-carry law, arguing that platforms’ passivity demonstrates that platforms have no First Amendment interest in setting editorial policies.
Legal disputes like the one in Taamneh, about whether plaintiffs have any cause of action against platforms in the first place, are relatively common in litigation. But academic and public policy discussion tends not to focus on these issues about the merits of plaintiffs’ underlying claims. Instead, academics and policymakers often focus on the other two major sources of law governing platforms and speech: immunity statutes and the Constitution. Statutory immunities under laws like Section 230 have been the subject of noisy debates for a number of years, while constitutional questions rose to prominence more recently. It is clear that the First Amendment sets real limits on the laws that govern platforms. But it is less clear exactly what those limits are.
Basic liability standards, immunity standards, and First Amendment standards all combine to shape platform regulation…. A predictable consequence of laws that hold platforms liable or potentially liable for user speech is that platforms will, out of caution, remove both unlawful and lawful material in order to protect themselves. As the Supreme Court has noted in cases about movie theaters and video distributors, speech intermediaries can be chilled more easily than ordinary speakers and publishers. Platforms generally have less motivation to defend users’ speech than users themselves do, and less ability to ascertain legally relevant facts. As recognized in a midcentury Supreme Court case about bookstores, Smith v. California, this can be a problem of constitutional dimension. Smith struck down a law that held bookstores strictly liable for obscene books, noting that such unbounded liability would foreseeably lead booksellers to err on the side of caution and “restrict the public’s access to forms of the printed word which the State could not constitutionally suppress directly.” The resulting “censorship affecting the whole public” would be attributable to state action, and “hardly less virulent for being privately administered.”
In Taamneh, the Court steered well clear of examining these First Amendment issues by rejecting plaintiffs’ claims under basic tort law. That doesn’t always work, though—as at least one amicus brief pointed out, and as I will discuss here. Courts’ resolutions of claims like the one in Taamneh can shape both platforms’ editorial policies and users’ opportunities to speak. As a result, free expression considerations can in turn play a structuring role when courts interpret the elements of such tort claims.