Courts Should Hold Social Media Accountable — But Not By Ignoring Federal Law

The Third Circuit’s recent decision withholding immunity from TikTok after a child died attempting a “Blackout Challenge” suggested on her “For You Page” was wrongly decided. But TikTok and other companies should not be absolved of responsibility for the foreseeable harms of their platforms. Rather than interpret away extant law, courts should recognize causes of action that accurately capture the ills of social media.

Venerated as “the twenty-six words that created the internet,” 47 U.S.C. § 230 is a federal law that immunizes platforms like TikTok, YouTube, or Facebook for the content people post there. Without Section 230, the logic runs, no platform would host user-generated content at scale for fear of being held responsible for it. Section 230 — which originally appeared in a law aimed at cleaning up the internet — also immunizes platforms for removing offensive content. The Supreme Court struck down the anti-indecency provisions of that law, the Communications Decency Act, in 1997 on free speech grounds. Section 230 survived.

Not everyone “likes” Section 230. There are Democrats who blame the law for the prevalence of hate speech, non-consensual pornography, misinformation, and other harmful online content. Congress amended Section 230 in 2018 to exclude online sex trafficking, and it never applied to copyright violations or federal crimes. But social media gets a pass for ignoring everything else. Some conservatives think platforms are instead overpolicing content. These individuals apparently see Section 230 as cover for tech companies trying to censor conservative viewpoints under the guise of combatting misinformation and hate speech.

The law is not popular among partisans. But Section 230 could always count on the courts. Judges have interpreted Section 230’s immunity broadly from its inception. A few cases — Fair Housing Council of San Fernando Valley v. Roommates.com in the Ninth Circuit, for example, or FTC v. Accusearch, Inc. in the Tenth Circuit — have found platforms liable for soliciting problematic participation. Overwhelmingly, however, courts have interpreted Section 230 to foreclose civil or state criminal liability premised on user-generated or “third party” content.

A recent decision by the Court of Appeals for the Third Circuit bucks this trend. Anderson v. TikTok, Inc. reverses a district court’s dismissal of a wrongful death lawsuit against TikTok on Section 230 grounds. The majority’s short opinion notes that editorial choices — including the algorithmic recommendation engine that matched ten-year-old Nylah Anderson with a “Blackout Challenge” video — constitute protected expression under the First Amendment following the Supreme Court’s July 2024 decision in Moody v. NetChoice LLC. If recommending content is TikTok’s “speech” for free speech purposes, the court reasons, then it is TikTok’s speech for purposes of Section 230 as well. The concurrence’s longer opinion reasons that distributing speech is different from merely hosting it. TikTok didn’t create the “Blackout Challenge,” but it did pass the video along to Nylah.

Suffice it to say that the Third Circuit’s reasoning finds little support in precedent or logic. As court after court, including the Third Circuit, has recognized, Section 230 forecloses civil or state criminal liability for platforms based on content furnished by third parties. Platforms cannot “be treated as the publisher or speaker of any information provided by another information content provider” — an immunity that goes beyond what prior First Amendment precedent requires. That editorial choices enjoy free speech protection under the Constitution does not convert TikTok into the statutory publisher or speaker of the harmful content it surfaces. Indeed, it seems hard to believe Moody, a case striking down Florida and Texas laws regulating social media on First Amendment grounds, intended to confer less statutory protection than Congress provided. And if, as the concurrence reasons, a platform can be held liable just for the way it distributes content, then TikTok or YouTube or anyone could be held liable for alphabetizing content, let alone displaying it in accordance with popularity.

The basis of the wrongful death lawsuit against TikTok cannot be that TikTok is responsible for the “Blackout Challenge” video that led to Nylah’s death. Congress has ruled this out. But there may be alternative ways for courts to hold TikTok and others accountable for the harms they impose.

Judges do not make law, except when they do. Each common law cause of action in tort arose when an English or American court decided to recognize a new civil wrong for which the law provides a remedy. The advent of new technology often plays a role in setting these changes in motion. Negligence arguably owes its contemporary stature to the proliferation of the train. The privacy torts reacted to the invention of “[i]nstantaneous photographs” and “numerous mechanical devices [that] threaten to make good the prediction that ‘what is whispered in the closet shall be proclaimed from the house-tops.’” Trespass to chattels went “electronic” with the introduction of email and its nemesis, spam.

Plaintiffs are beginning to test the boundaries of tort law once again to fit social media. Seattle and other public-school districts recently sued TikTok, YouTube, and other platforms on the age-old theory of nuisance, arguing that these companies endanger public health by fostering a toxic online environment. When two boys died in a high-speed accident trying to trigger Snapchat’s “Speed Filter,” the Ninth Circuit allowed a cause of action to proceed against the company for negligent design. Snap could be held responsible for the “predictable consequences” of its irresponsible feature, the court reasoned, even though the “Speed Filter” always accompanied user-generated content. Washington election officials successfully sued Facebook, over its Section 230 objection, for failing to keep records on political ads in the state. The emphasis, again, was on Facebook’s own conduct around the ads, rather than the content of the ads themselves.

There is an admittedly fine line between attributing third party content to the platform, which federal law forbids, and holding the platform accountable for foreseeable harms to people and communities, which tort law encourages. What did TikTok do wrong in Anderson? It did not film or upload a dangerous challenge video, and it cannot be held liable for hosting, distributing, or even recommending it. But has TikTok invested enough time and resources in protecting children on the platform, especially considering what the company knows about the toxic content that appears there? Should families like Nylah’s be able to rely upon TikTok’s own community guidelines, which pledge to “[r]estrict content that is not suitable for youth”? Such questions sound less in derivative liability than in non- and misfeasance. Section 230 was meant to be a shield, not a shibboleth. Courts should be trying to thread this needle, rather than pretending Section 230 does not exist. Obviously wrong interpretations of Section 230, like the Third Circuit’s in Anderson v. TikTok, Inc., only set the law back.


