JANUARY 12, 2021 DRAFT-NOT FOR CITATION OR DISTRIBUTION

                                      Content Moderation Remedies
                                            By Eric Goldman*

                                                   Abstract

How and why Internet services moderate content has become a major social issue and the subject
of intense scrutiny. This article considers an area of content moderation that has gotten
comparatively little attention. When an Internet service determines that third-party content
violates its rules, what actions should it take? The longstanding default assumption is that
Internet services will remove problematic content or authors from their services, and some
statutes mandate that result. However, it turns out that Internet services have a wide range of
options (“remedies”) they can take in response to problematic content. The Article taxonomizes
over three dozen “remedies” available to Internet services to redress problematic third-party
content and authors. The Article then provides a normative framework to help Internet services
and regulators navigate this taxonomy so that they can develop the optimal remedies “strategy”
for their communities. By getting past the binary remove-or-not remedy concept that dominates
content moderation today, this Article should help improve the efficacy of content moderation,
promote free expression, promote more competition among Internet services, and improve
Internet services’ community-building functions.

*
  Associate Dean for Research, Professor of Law, and Co-Director of the High Tech Law Institute, Santa Clara
University School of Law. egoldman@gmail.com. http://www.ericgoldman.org. I was General Counsel of
Epinions.com, a consumer review service, from 2000-02. I appreciate the helpful comments from participants at
USENIX Free and Open Communications on the Internet Workshop (FOCI ’19); the Seventh Annual Computer
Science and the Law Workshop at the University of Pennsylvania Law School; the Princeton University Center for
Information Technology Policy Luncheon Series; the 9th Annual Internet Law Works-in-Progress Conference; the
Works in Progress in Intellectual Property (WIPIP) at University of Houston Law Center; the Faculty Workshop at
Santa Clara University School of Law; the Tel Aviv University Faculty of Law technology law seminar; the Law
and Technology Scholarship Seminar at UC Berkeley School of Law; the Faculty Workshop at University of San
Diego School of Law; Center for Intellectual Property Research’s International IP Colloquium at Indiana University
Maurer School of Law; and the Freedom of Expression Scholars Conference 2020 at Yale Law School; as well as
evelyn douek, Alex Feerst, James Grimmelmann, Oluwatomiwa Ilori, Thomas Kadri, Daphne Keller, Aleksandra
Kuczerawy, Judy Malloy, Enguerrand Marique, Yseult Marique, Jess Miers, Irina Raicu, Lisa Ramsey, Betsy
Rosenblatt, Colin Rule, Rebecca Tushnet, Rachel Wolbers, and Tal Zarsky. I am inspired by the work of Adelin Cai
and Clara Tsao. This project was supported in part by grants from the John S. and James L. Knight Foundation and
the Nebraska Governance and Technology Center's Summer Grant Program 2020.

Table of Contents

Introduction .......................................................................................................................................
I. Project Context ..............................................................................................................................
     A. How This Project Enhances the Content Moderation Literature ...........................................
     B. The Nomenclature Problem ...................................................................................................
     C. The Inapplicability of the Traditional Remedies Literature ...................................................
     D. Analogies to Other Private Organizations ..............................................................................
II. Prior Literature on Content Moderation Remedies ......................................................................
     A. Statutes, Principles, and Policy Proposals ............................................................................
         1. DMCA Online Safe Harbors .............................................................................................
         2. E.U. E-Commerce Directive and Its Progeny ...................................................................
         3. The Manila Principles .......................................................................................................
         4. Santa Clara Principles .......................................................................................................
         5. The “Internet Balancing Formula” ..........................................................................
     B. Copyright Initiatives ...............................................................................................................
         1. Principles for User Generated Content Services ...............................................................
         2. Graduated Response/Copyright Alert System ..................................................................
III. A Taxonomy of Remedy Options ...............................................................................................
     A. Content Regulation ................................................................................................................
     B. Account Regulation ................................................................................................................
     C. Visibility Reductions ..............................................................................................................
     D. Monetary ................................................................................................................................
     E. Other .......................................................................................................................................
     F. Combining Remedies ..............................................................................................................
IV. Prioritizing Remedy Options ......................................................................................................
     A. Policy Levers ..........................................................................................................................
         1. Severity of the rule violation .............................................................................................
         2. Confidence that a rule violation actually occurred ...........................................................
         3. Scalability and consistency ...............................................................................................
         4. The community’s ability to self-correct .............................................................................
         5. How the remedies impact others .......................................................................................
         6. Retaining user engagement ...............................................................................................
         7. Parallel sanctions ..............................................................................................................
     B. Some (Tentative) Normative Views of How to Prioritize
        Content Moderation Remedies ..............................................................................................
         1. Avoid Mandatory One-Size-Fits-All Remedies ................................................................
         2. Some Internet Services Have Limited Remedy Options ..................................................
         3. Better Design to Discourage Problem Creation ................................................................
         4. Private Remedies Are (Usually) Preferable to Judicial Remedies ....................................
         5. Remedies Should Be Necessary and Proportionate ..........................................................
         6. Prefer Remedies that Empower Readers ...........................................................................
     C. Implications for “Platform” Transparency ............................................................................
Conclusion ........................................................................................................................................

Introduction

In May 2019, a President Trump supporter published a video of House Speaker Nancy Pelosi in which authentic footage was slowed down without lowering the voice pitch,1 conveying the inauthentic impression that Speaker Pelosi had delivered her remarks while intoxicated. This
video quickly became a viral sensation, spreading rapidly across the Internet.2

The hoax video raises many interesting policy questions,3 but this Article focuses on how three
major social media services—Facebook, Twitter, and YouTube—responded to the video. Often,
these three services reach the same conclusions about how to handle a controversial high-profile
item of content…but not in this case.

Instead, each service did something different with the Pelosi hoax video. Twitter left the video
up.4 YouTube removed the video.5 Facebook chose a third option: it allowed the video to remain
on its service, but it attempted to dissuade users from sharing it.6 Facebook users who attempted
to share the video were presented with this screen:7

1
  Kevin Poulsen, We Found The Guy Behind the Viral ‘Drunk Pelosi’ Video, THE DAILY BEAST, June 1, 2019,
https://www.thedailybeast.com/we-found-shawn-brooks-the-guy-behind-the-viral-drunk-pelosi-video.
2
  One version of the video was viewed two million times. Sue Halpern, Facebook’s False Standards for Not
Removing a Fake Nancy Pelosi Video, NEW YORKER, May 28, 2019, https://www.newyorker.com/tech/annals-of-
technology/facebooks-false-standards-for-not-removing-a-fake-nancy-pelosi-video.
3
  For example, there are substantial and legitimate concerns about “deepfake” videos that are completely fictional
but look authentic. See Robert Chesney & Danielle Keats Citron, Deep Fakes: A Looming Challenge for Privacy,
Democracy, and National Security, 107 CALIF. L. REV. 1753 (2019). The video of Nancy Pelosi wasn’t a “deepfake”
but instead a “cheap fake,” a euphemism for manipulated authentic videos. Britt Paris & Joan Donovan, Deepfakes
and Cheap Fakes: The Manipulation of Audio and Visual Evidence, DATA & SOC., Sept. 2019,
https://datasociety.net/wp-content/uploads/2019/09/DS_Deepfakes_Cheap_FakesFinal-1.pdf.
4
  Halpern, supra note XX.
5
  Emily Stewart, A Fake Viral Video Makes Nancy Pelosi Look Drunk. Facebook Won’t Take It Down., RECODE,
May 24, 2019, https://www.vox.com/recode/2019/5/24/18638822/nancy-pelosi-doctored-video-drunk-facebook-
trump.
6
  Halpern, supra note XX.
7
  Tweet of Donie O’Sullivan, May 25, 2019, https://twitter.com/donie/status/1132327255802294274. If you can’t
read the photo, Facebook’s pop-up warning says: “Before you share this content, you might want to know that there
is additional reporting on this from PolitiFact, 20 Minutes, Factcheck.org, Lead Stories and Associated Press” with
links to each of those sources.

Facebook received heavy criticism for not removing the video,8 but its decision raises intriguing
possibilities. Ordinarily, we just assume that social media and other user-generated content
services choose between the binary options of leaving content up (like Twitter did) or removing
content (like YouTube did). Facebook showed that there are other options. What are these
options, and when might they be a helpful alternative to the standard binary choices?

                                                      ***

How Internet services remove, or decide to keep publishing, third-party content—a process
called content moderation9—has become a major issue in our society, and for good reason. The

8
  Donie O’Sullivan, Pelosi Calls Facebook a 'Shameful' Company That Helped in 'Misleading the American People',
CNN.COM, Jan. 16, 2020, https://www.cnn.com/2020/01/16/tech/pelosi-shameful-facebook/index.html.
9
  See, e.g., What is Content Moderation?, BESEDO, July 12, 2019, https://besedo.com/resources/blog/what-is-
content-moderation/ (“Content moderation is when an online platform screen and monitor user-generated content
based on platform-specific rules and guidelines to determine if the content should be published on the online
platform, or not”); James Grimmelmann, The Virtues of Moderation, 17 YALE J.L. & TECH. 42, 47 (2015) (defining
moderation as “the governance mechanisms that structure participation in a community to facilitate cooperation and
prevent abuse”); Shagun Jhaver, Amy Bruckman, & Eric Gilbert, Does Transparency in Moderation Really Matter?
User Behavior After Content Removal Explanations on Reddit, Proceedings of the ACM on Human-Computer
Interaction, Article 150 (Nov. 2019), https://dl.acm.org/doi/10.1145/3359252 (“Content moderation determines
which posts are allowed to stay online and which are removed, how prominently the allowed posts are displayed,
and which actions accompany content removals”).

consequences of content moderation decisions can be quite significant. As the Pelosi hoax video
example shows, the Internet service’s decision could have major political consequences, and
content moderation can be life-changing in myriad other ways.

Because of these high stakes, it is currently axiomatic that problematic user content online
should be removed (or depublished) as quickly as possible, and certainly after the service has
been notified of the problem. Consistent with those concerns, many laws make removal the
service’s mandatory obligation when remediating online content.

The “remove problematic content” presumption is so deeply entrenched10 that it has overshadowed
the exploration of alternative “remedies” to problematic online content. That’s unfortunate
because there are actually many options available to handle problematic content other than
removal.

This Article explores this underexplored angle through two successive inquiries. The Article first
comprehensively taxonomizes dozens of “remedies” to redress problematic third-party content
online. Then, the Article addresses the related normative questions: how should this universe of
remedy options be prioritized? Which remedies are best, and why?

This Article advances the consideration of content moderation, and how to regulate it, in two
important ways. First, it helps expose a wider range of remedy options than are traditionally
acknowledged. We should not overrely on removals as the primary or exclusive remedy because
of the potentially significant collateral damage that can be caused by information suppression.11
Some circumstances call for a velvet glove rather than a sledgehammer.12 Expanding the remedy
toolkit allows for more tailored remedies that can balance the benefits and harms from continued
publication. This advances free expression while still redressing problematic content.13

Second, online communities have diverse audiences with idiosyncratic needs. An expanded
remedy toolkit will let Internet services refine and optimize their content moderation approaches
to best cater to their specific community’s needs.14 Indeed, the service’s remedy “strategy” can
become a key point of competitive differentiation. Services competing for the same audiences
can adopt differing strategies and let the audience choose which approach works better. Thus, an
expanded remedy toolkit can enhance marketplace competition and help services do a better job
catering to their audiences.

10
   MacKenzie F. Common, Fear the Reaper: How Content Moderation Rules Are Enforced on Social Media, 34
INT’L REV. L. COMPUTERS & TECH. 126 (2020),
https://www.tandfonline.com/doi/abs/10.1080/13600869.2020.1733762 (referring to the “obsession with removal”).
11
  TARLETON GILLESPIE, CUSTODIANS OF THE INTERNET 176 (2018) (removal “is the harshest approach, in terms of
its consequences… Removal is a blunt instrument, an all-or-nothing determination”).
12
   As Gillespie described it, “removing content or users is akin to the most profound kind of censorship.” Id. at 177.
13
   Molly K. Land & Rebecca J. Hamilton, Beyond Takedown: Expanding the Toolkit for Responding to Online Hate,
in PROPAGANDA, WAR CRIMES TRIALS AND INTERNATIONAL LAW: FROM COGNITION TO CRIMINALITY 143 (Predrag
Dojcinovic, ed. 2020).
14
   See Land & Hamilton, supra note __; evelyn douek, The Rise of Content Cartels, KNIGHT FIRST AMENDMENT
INSTITUTE, Feb. 11, 2020, https://knightcolumbia.org/content/the-rise-of-content-cartels (raising concerns about
cross-industry “cartels” that establish uniform content policies across the industry).

The benefits of an expanded remedies toolkit are possible only with regulatory acquiescence.
Yet when regulators routinely mandate removal as the remedy for problematic content, they
take away any discretion by Internet services to explore the full spectrum of potential remedies.
To avoid this, regulators should only require that Internet services take appropriate steps to
redress a problem, proportionate to the harm. Removal can be specified as one qualifying option,
but it should not be specified as the only option.

The process of content moderation has significant stakes for how we engage and communicate
with each other as a society. It’s crucial that we get that process right.
Limiting the range of remedies available to redress problematic content will make it harder for us
to optimize and fine-tune content moderation processes to the degree necessary to achieve these
socially important goals.

The Article proceeds in four parts. Part I contextualizes the Article in the literature and explains
some unusual challenges with this research project. Part II demonstrates a variety of ways that
the binary remedy approach (leave up/remove) is hard-coded in the law and literature pertaining
to content moderation. Part III provides a comprehensive inventory of content moderation
remedies. Part IV explores how to choose between the options in Part III’s inventory by
enumerating various normative considerations.

I. Project Context

Before reaching the Article’s substantive discussion, this Part situates the project in the literature,
explains some of the unique challenges of this research project, and establishes some project
boundaries.

Private Internet companies—like all other private companies—manage their service offerings.
The managerial powers of Internet companies include the ability to establish, implement, and
enforce a governance system for their customers’ online content and actions. In particular,
private companies take steps to redress problems with their customers’ content and actions. The
Article explains what steps are available to the companies and how they should decide which
steps are appropriate in specific circumstances.

A. How This Project Enhances the Content Moderation Literature

The social importance of content moderation has spurred the growth of academic analyses15 of,
and civil society statements16 about, content moderation. There is also an active literature
relating to “platform governance,” “algorithmic accountability,” and analogous topics, which
does not always directly address content moderation but often has substantial implications for

15
   E.g., GILLESPIE, supra note __; SARAH T. ROBERTS, BEHIND THE SCREEN: CONTENT MODERATION IN THE
SHADOWS OF SOCIAL MEDIA (2019); NICOLAS SUZOR, LAWLESS: THE SECRET RULES THAT GOVERN OUR DIGITAL
LIVES (2019); Kate Klonick, The New Governors: The People, Rules, and Processes Governing Online Speech, 131
HARV. L. REV. 1598 (2018).
16
   E.g., Manila Principles on Intermediary Liability, Version 1.0, March 24, 2015,
https://www.eff.org/files/2015/10/31/manila_principles_1.0.pdf; Santa Clara Principles,
https://santaclaraprinciples.org/.

content moderation issues. Collectively, this precedent literature generally addresses one of three
topics:

Literature Topic 1: What content and activity is permissible online? These are the substantive
rules for content and activities, such as rules that child pornography and copyright infringement
are not permissible or that political speech is permitted. There are longstanding, ongoing, and
vigorous debates over what content and activities should be permitted online.

Literature Topic 2: Who should make the substantive rules of online content and activities? Rule-
making is a core function of governments, and their rules can be expressed through official
substantive law—such as declaring certain content and activities as illegal or tortious—or “soft”
law, such as when regulators “jawbone”17 Internet companies to “voluntarily” redress legal but
unwanted or harmful content.18

Companies also create their own substantive rules for their services (what I call “house rules”).
House rules apply to content and activity that is legal but that the companies nevertheless decide,
as a matter of editorial policy, to restrict.

There are ongoing and vigorous debates over who should set the rules for online content and
activity—governments, companies, or others.

Literature Topic 3: Who determines if a rule violation has occurred, and who hears any appeals
of those decisions? Traditionally, courts or other government entities have played a preeminent
role, at least with respect to legally significant decisions. In contrast, with respect to online
content or actions, Internet services make virtually all of their own decisions about rule
violations (though sometimes they abide by the decisions of independent third parties).19 There
are ongoing and vigorous debates over who should determine if a rule violation (whether it’s a
house rule or a government mandate) has occurred and hear appeals.

The Underexplored Fourth Topic: This Article does not directly engage any of these three topics.
Instead, it addresses a fourth topic that has received much less attention during the debates over
the other three questions. The topic of this Article is: assuming that a rule violation has taken
place, what steps should the service take to redress the violation?20 In other words, what are the

17
   Derek E. Bambauer, Against Jawboning, 100 MINN. L. REV. 51 (2015). This is also called “working the ref.” E.g.,
Eric Alterman, The Right Is Working the Ref Yet Again. This Time on Facebook—and It’s Working, THE NATION,
Aug. 16, 2018, https://www.thenation.com/article/archive/the-right-is-working-the-ref/.
18
   For example, the U.K. wants Internet services to eliminate lawful but harmful content (sometimes colloquially
described as “lawful but awful” content). Online Harms White Paper, U.K. Secretary of State for Digital, Culture,
Media & Sport and the Secretary of State for the Home Department, Apr. 2019,
https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/793360/Online_H
arms_White_Paper.pdf; Eric Goldman, The U.K. Online Harms White Paper and the Internet’s Cable-ized Future,
16 OHIO STATE TECH. L.J. 351 (2020).
19
   E.g. Ripoff Report’s Arbitration Program, https://www.ripoffreport.com/arbitration (allowing businesses to redact
portions of negative reviews if a third-party arbitrator agrees); Facebook’s Independent Oversight Board (sometimes
colloquially called the “Facebook Supreme Court”), https://about.fb.com/news/2019/09/oversight-board-structure/.
Online dispute resolution (ODR) can also play a role here.
20
   See DOUGLAS LAYCOCK & RICHARD L. HASEN, MODERN AMERICAN REMEDIES: CASES AND MATERIALS 2
(Concise 5th ed. 2018) (“In every case, we will assume that defendant’s conduct is unlawful and ask what the court

appropriate “remedies” for a rule violation? (The term “remedies” has its own problems, which
I’ll discuss momentarily.)

Admittedly, the remedies topic cannot be addressed in isolation from the other three topics. The
legitimacy of any implemented remedy will depend, in part, on the legitimacy of the underlying
content moderation system, including the rules, who set them, and how violations were
determined. If the system of content moderation lacks legitimacy, any associated remedies
scheme will too. Because of the close interplay between the content moderation structure and the
remedies scheme, addressing the remedies issue in isolation inevitably feels a little hollow.

This Article nevertheless isolates the remedies questions from the other content moderation
topics for two reasons. First, by shining a light on this topic, it raises the profile of an issue
that might otherwise get secondary consideration. Second, this Article can do a more nuanced
and detailed review of the remedies than it could if it attempted to comprehensively address
content moderation.21 However, Part IV will partially relax the Article’s single-minded focus on
remedies to reconsider the interplays between remedies and other aspects of content moderation.

B. The Nomenclature Problem

Referring to the consequences of rule violations as “remedies” creates at least two semantic
problems. First, the term “content moderation remedies” is implicitly redundant. “Moderation”
of content is itself a “remedy.” It might be more precise to describe this Article’s topic as
“content moderation actions,”22 which makes more explicit that “moderation” is a euphemism
for a range of possible outcomes.

Second, the term “remedies” implies that someone is benefiting from the action. For example, in
court, a judge crafts remedies that redress a successful litigant’s issue. With content moderation,
it may not be clear who the analogous beneficiary is—especially when no one complained to the
Internet service about the content. Sometimes, an Internet service self-initiates content
moderation in response to a violation of its house rules. In those cases, it’s possible that the
service is the purported “victim” and is engaging in self-help. This is counterintuitive; we are
tempted to consider some harmed third party (say, a copyright owner or defamed individual) as
the victim, when in fact it’s the service that is the most direct beneficiary of any remediation. It
feels weird to call that outcome a “remedy.”

Furthermore, “content moderation” is a synonym for an Internet service’s editorial practices.
Internet companies publish third-party content, so content management decisions are editorial

can do about it: What does plaintiff get? How much does he get? Why does he get that instead of something more,
or less, or entirely different?”).
21
   See LAYCOCK & HASEN, supra note __, at 7 (“Whether we design remedies that encourage profitable violations,
or remedies that seek to minimize violations, or remedies that serve some other purpose altogether, we are making
choices distinct from the choices we make when we design the rest of the substantive law.”).
22
   Internet services now sometimes use “action” as a verb for their content decisions. For example, upon removing a
group, Pinterest explained the group “was actioned and labeled for misinformation, specifically conspiracies and
health misinformation.” See Jason Koebler, Pinterest Bans Anti-Abortion Group Live Action for Posting
Misinformation, VICE.COM, June 12, 2019, https://www.vice.com/en_us/article/ywyx7g/pinterest-bans-anti-
abortion-group-live-action-for-posting-misinformation.

decisions.23 This suggests that content moderation remedies are “editorial tools” that companies
use in their publication process.

Despite these semantic problems, I have stuck with the “remedies” characterization as a
convenient shorthand descriptor for this set of editorial activities. No other descriptor fit better.24

C. The Inapplicability of the Traditional Remedies Literature

There is a rich and venerable academic literature about “remedies” for legal violations. At the
highest level of generality, this Article shares the same structural goals. The criminal justice
system advances the social goals of punishment/retribution, deterrence, incapacitation
(segregating dangerous individuals from the rest of the community), rehabilitation, and
expressive justice;25 additional remedial goals include victim restitution. Those normative values
should influence content moderation design as well.26

23
   Compare Hassell v. Bird, 247 Cal. App. 4th 1336 (Cal. App. Ct. 2016), rev’d 5 Cal. 5th 522 (2018) (describing
Yelp as a speech “administrator”).
24
   Marique & Marique adopted the term “sanctions.” Enguerrand Marique & Yseult Marique, Sanctions on Digital
Platforms: Balancing Proportionality in a Modern Public Square, 36 COMPUTER L. & SECURITY REV. 105372
(2020). They define “sanctions” as “the exercise of power and taken by digital operators towards undesirable
behavior on the modern public square. Sanctions react to a specific problematic behavior defined as such by a
socially recognized rule.” However, for purposes of this Article, this overemphasizes punitive aspects, when
remedies do not need to be punitive in nature.
25
   E.g., SANFORD H. KADISH & STEPHEN J. SCHULHOFER, CRIMINAL LAW AND ITS PROCESSES: CASES AND
MATERIALS 101-53 (6th ed. 1995); JOSHUA DRESSLER, UNDERSTANDING CRIMINAL LAW, ch. 2 (7th ed. 2015); see
also Randy E. Barnett, Restitution: A New Paradigm of Criminal Justice, 87 ETHICS 279 (1977); Richard A. Posner,
Retribution and Related Concepts of Punishment, 9 J. LEG. STUDIES 71 (1980). Congress has codified its normative
values for federal criminal sentencing:
          The court, in determining the particular sentence to be imposed, shall consider—
          (1) the nature and circumstances of the offense and the history and characteristics of the defendant;
          (2) the need for the sentence imposed—
                    (A) to reflect the seriousness of the offense, to promote respect for the law, and to provide just
                    punishment for the offense;
                    (B) to afford adequate deterrence to criminal conduct;
                    (C) to protect the public from further crimes of the defendant; and
                    (D) to provide the defendant with needed educational or vocational training, medical care, or other
                    correctional treatment in the most effective manner;
          (3) the kinds of sentences available;…
          (6) the need to avoid unwarranted sentence disparities among defendants with similar records who have
          been found guilty of similar conduct; and
          (7) the need to provide restitution to any victims of the offense.
18 U.S.C. § 3553(a).
          Civil remedial schemes are similar. Laycock & Hasen categorize civil remedies into the following
categories: compensatory remedies, preventive remedies (including coercive and declaratory remedies),
restitutionary remedies, punitive remedies, and ancillary remedies. LAYCOCK & HASEN, supra note __, at 2.
          See generally Marique & Marique, supra note __ (discussing how “sanctions” can be “retributive,”
“reparative,” or “pedagogic”).
26
   E.g., Sarita Schoenebeck, Oliver L Haimson, & Lisa Nakamura, Drawing from Justice Theories to Support
Targets of Online Harassment, NEW MEDIA & SOCIETY, Mar. 25, 2020,
https://journals.sagepub.com/doi/full/10.1177/1461444820913122 (discussing how criminal justice theories can
inform content moderation).

However, in general, the remedies literature doesn’t help address the questions raised by this
Article. This Article considers remedies that are determined and effectuated by private Internet
companies, not state actors like government-operated institutions.27 This difference is key:

Accountability. The government imposes its rules on its citizens whether they agree or not.
Citizens must honor the government-set rules that apply to them, and usually they must pay the
applicable taxes that fund government services such as a judicial system. Citizens have a voice in
this governance through their right to vote.

Private companies are categorically different. They can’t impose taxes; they cannot compel rule
compliance through tax-funded police powers; they cannot be voted out. Most importantly,
citizens are not forced to use them. As a result, their remedy schemes have different
accountability mechanisms and different impacts.

Unavailability of Certain Remedies. This Article focuses only on actions that private companies
can take against online content or actions. As a result, some of the harshest remedies that can be
deployed by state actors are categorically unavailable to the private companies. For example,
private companies cannot directly garnish a person’s wages;28 seize their physical assets; remove
a child from a parent’s custody; incarcerate a person or otherwise deprive them of their physical
freedom; or impose capital punishment.

More generally, Internet services can only regulate behavior within their virtual premises;29
remedies affecting the world outside the Internet service typically require the voluntary
cooperation of others. Because the intersection between the virtual premises and a non-compliant
user may be relatively limited, the Internet service has a far more limited toolkit of remedy
options than a government actor, which can affect virtually every aspect of a person’s life.

Constitutional Limits. Because of citizens’ lack of choice and the extraordinary police powers
vested in governments, the Constitution protects citizens by substantially restricting how the
government can use its coercive powers.30 None of those considerations apply in the private
sector context. Due to their fundamentally different role in our society, private entities are not
subject to those Constitutional restrictions. Indeed, courts routinely reject efforts by plaintiffs to
impose Constitutional obligations on Internet companies predicated on the argument that they are
the synthetic equivalent of governments.31

27
   E.g., LAYCOCK & HASEN, supra note __, at 1 (“A remedy is anything a court can do for a litigant who has been
wronged or is about to be wronged”) (emphasis added).
28
   With the exception that when a service is paying the alleged offender, it can stop payment, an option considered in
Part III.
29
   Jennifer L. Mnookin, Virtual(ly) Law: The Emergence of Law in LambdaMOO, J. COMPUTER-MEDIATED COMM.,
June 1996, https://academic.oup.com/jcmc/article/2/1/JCMC214/4584334 (discussing how LambdaMOO
intentionally limited its remedial system to in-world consequences).
30
   See Developments in the Law: Alternatives to Incarceration, 111 HARV. L. REV. 1863, 1950-55 (1998) (discussing
constitutional challenges to incarceration alternatives).
31
   “[C]ase law has rejected the notion that private companies such as Facebook are public fora….[S]imply because
Facebook has many users that create or share content, it does not mean that Facebook…becomes a public forum.”
Federal Agency of News LLC v. Facebook, Inc., 2020 WL 137154 (N.D. Cal. Jan. 13, 2020); see also Prager
University v. Google LLC, 2018 WL 1471939 (N.D. Cal. March 26, 2018); Buza v. Yahoo, Inc., 2011 WL 5041174
(N.D. Cal. Oct. 24, 2011); Langdon v. Google, Inc., 474 F. Supp. 2d 622 (D. Del. 2007); Eric Goldman, Of Course

Much of the academic literature on remedies assumes that the remedies will be determined and
implemented by state actors. Private actors, with their structurally different attributes, raise
different considerations that are not contemplated by the standard remedies literature.32

The Laws of Physics Don’t Apply. Governments’ coercive powers are intrinsically constrained by
the laws of physics. For example, governments cannot incarcerate a person who is physically
absent. Or, in the media context, a publisher has limited post-publication options against the
physical copies of its works once those copies have left its inventory. In contrast,
physics does not constrain Internet services’ remedies. Instead, the range of possible online
remedies is constrained only by the technical limits of the underlying software code.33 For
example, Internet services can create and impose the remedy of “toading,”34 which converts a
game player’s avatar into a virtual toad with restricted player functionality. Obviously, toading is
impossible in the offline world. This shows how Internet services have an expanded universe of
potential remedies, which lets them devise creative responses without historical precedent.

D. Analogies to Other Private Organizations

The previous subpart explained why the Article addresses issues different from those raised by the
traditional remedies imposed by governments. A better analogy would be to other private institutions that
regulate the behavior of their stakeholders. For example, companies often have explicit or de
facto policies about terminating services for customer misbehavior.35

An even better analogy is how membership organizations—such as fraternities and sororities,
religious organizations, professional associations, and sports leagues—deploy private
disciplinary systems for member misbehavior.36 To build these disciplinary systems, these
membership organizations decide what remedies to impose for member misbehavior, such as
how a football or baseball league decides what consequences to impose when a player tests
positive for banned drugs, is convicted of domestic violence, or engages in unsportsmanlike
conduct in-game. The membership organizations have a variety of tools—“remedies”—to
respond to member misbehavior, such as imposing fines or suspending membership. These
considerations partially resemble the decision-making by private Internet companies for
“misbehavior” by their users.

the First Amendment Protects Google and Facebook (and It’s Not a Close Question), a response to Heather
Whitney’s paper, Search Engines, Social Media, and the Editorial Analogy, Knight First Amendment Institute’s
Emerging Threats series, Feb. 2018, https://ssrn.com/abstract=3133496.
32
   For a deeper look at this issue, see Maayan Perel, Digital Remedies, 35 BERKELEY TECH. L.J. 1 (2020) (discussing
the problems when courts delegate responsibility for implementing equitable relief to private Internet companies).
33
   LAWRENCE LESSIG, CODE AND OTHER LAWS OF CYBERSPACE (1999).
34
   See generally Mnookin, supra note __, at n.44.
35
   Where the customer’s “misbehavior” merely could be unprofitability to the business. LARRY SELDEN & GEOFFREY
COLVIN, ANGEL CUSTOMERS AND DEMON CUSTOMERS: DISCOVER WHICH IS WHICH AND TURBO-CHARGE YOUR
STOCK (2003).
36
   Mnookin, supra note __ (noting the analogy to “social clubs”). Cf. Eric Schlachter, Cyberspace, the Free Market,
and the Free Marketplace of Ideas: Recognizing Legal Differences in Computer Bulletin Board Functions, 16
HASTINGS COMM. & ENT. L.J. 87 (1993) (discussing membership organizations as an analogy for the governance of
BBSes).

Thus, private Internet companies could learn a lot from how membership organizations build and
implement remedies for member misbehavior. Unfortunately, I did not find any precedent
literature explaining how membership organizations should design their remedies, which limited
my ability to explore that analogy further. That appears to be an open research topic.

In sum, as this subpart indicates, I did not find a good precedent literature addressing the issue I
wanted to address. The existing remedies literature addresses a structurally different set of
entities; the closer analogy of membership organizations lacks an academic literature about
remedy design that I could compare/contrast. Part IV will tackle the issue of private entity
remedy design, but unfortunately without the benefit of relying upon an independent analytical
framework.

II. Prior Literature on Content Moderation Remedies

Regulators, academics, and civil society have widely conceptualized content moderation
remedies as binary on/off switches. They have assumed that content stays up or comes down and
that accounts stay active or are terminated. This Part will provide examples of their approaches
to remedies.

A. Statutes, Principles, and Policy Proposals

Regulators have hard-wired the binary approach to content moderation in dozens or hundreds of
laws throughout the world.37 Civil society entities have issued principles to regulators to help
guide their development of Internet law; and they too have encoded binary thinking about
remedies. This subpart provides four examples of the pervasiveness and uncritical reflection
about binary content moderation remedies:

1. DMCA Online Safe Harbors

In 1998, Congress enacted the Digital Millennium Copyright Act (DMCA), including a safe
harbor for hosting services codified at 17 U.S.C. § 512(c), often called the “notice-and-takedown”
provision. The safe harbor incorporates binary remedies in two ways: removal of individual files
and termination of recidivist accounts.38

First, the 512(c) safe harbor contemplates that copyright owners will notify hosts about allegedly
infringing user uploads. As a precondition to the safe harbor, the host then must expeditiously
“remove[] or disable access to” user-uploaded files following proper notice from the copyright
owners.39 The net effect is that 512(c) drives hosts towards a single remedy in the event they are

37
   For an analogous discussion of the hard-wired binary approach to remedies in the criminal context, see, e.g., Dan
M. Kahan, What Do Alternative Sanctions Mean?, 63 U. CHI. L. REV. 591 (1996); Developments in the Law:
Alternatives to Incarceration, 111 HARV. L. REV. 1863 (1998).
38
   Grimmelmann, supra note __, at 107.
39
   17 U.S.C. §§ 512(c)(1)(A)(iii) & 512(c)(1)(C).

notified of claimed user infringement—they must remove the identified files if they want the safe
harbor.40

Second, Internet services wanting the Section 512 safe harbor must also terminate users “who are
repeat infringers.”41 This requires hosts to track users who engage in the qualifying infringing
behavior,42 sometimes called issuing “strikes.”43 If a user accumulates too many strikes, the
service must terminate the user’s account to remain eligible for the safe harbor.44 The safe harbor
doesn’t specify when a user is an “infringer” (must a court adjudge the user as an infringer, or
will copyright owners’ unsubstantiated allegations suffice?), nor does the safe harbor specify a
minimum number of strikes before a user qualifies as a “repeat” infringer.45
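
To make these mechanics concrete, the following sketch (written in Python purely for illustration; the class structure, the field names, and the three-strike threshold are my assumptions, since Section 512 prescribes none of them) models the binary workflow the safe harbor drives hosts toward: remove the noticed file, issue a “strike,” and terminate the account once strikes accumulate.

    from collections import defaultdict

    # Illustrative sketch of the Section 512(c) workflow described above.
    # The statute specifies no strike threshold; three is used here only
    # because some services have adopted that number (see footnote 44).
    STRIKE_THRESHOLD = 3

    class HostingService:
        def __init__(self):
            self.files = {}                    # file_id -> uploader account
            self.strikes = defaultdict(int)    # account -> accumulated strikes
            self.terminated = set()            # accounts terminated as repeat infringers

        def handle_takedown_notice(self, file_id):
            """Remove (or disable access to) the noticed file and record a strike."""
            uploader = self.files.pop(file_id, None)
            if uploader is None:
                return                         # file already gone; nothing further to do
            self.strikes[uploader] += 1        # issue a "strike" to the uploader
            if self.strikes[uploader] >= STRIKE_THRESHOLD:
                self.terminated.add(uploader)  # repeat-infringer account termination

Notably, the only remedies this workflow contemplates are the two binary ones the statute names: removing the file and terminating the account.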

2. E.U. E-Commerce Directive and Its Progeny

The European Union adopted its “E-Commerce Directive”46 in 2000 in the wake of the DMCA’s
passage. The directive reflected principles similar to those of the DMCA online safe harbor. However,
unlike the DMCA online safe harbor, which only pertained to alleged copyright infringement,
the E-Commerce Directive applies to all types of illegal or tortious material. Like the DMCA
online safe harbor, the E-Commerce Directive contemplated a notice-and-takedown scheme
where hosts would remove or disable access in response to takedown notices.47

The E-Commerce Directive’s notice-and-takedown scheme has propagated throughout Europe in
a variety of incarnations. For example, in 2017, Germany passed the
Netzwerkdurchsetzungsgesetz, the Network Enforcement Act, commonly referred to as
“NetzDG.” NetzDG combats certain types of socially harmful content by requiring that hosts
“remove or block access” to verboten content within certain time periods.48 Similarly, the U.K.
Defamation Act requires web hosts to remove allegedly defamatory user statements within 48

40
   The statute also references “disabling access.” It’s unclear how this differs from removal. Perhaps the host does
not delete the file on its servers but still renders it unavailable to anyone. If so, the net effect for users would be the
same. I am not aware of any hosts that disable access rather than remove files, nor am I aware of any cases
interpreting this phrase.
41
   17 U.S.C. § 512(i)(1)(A).
42
   E.g., Ventura Content, Ltd. v. Motherless, Inc., 885 F.3d 597 (9th Cir. 2018).
43
   Shoshana Wodinsky, YouTube’s Copyright Strikes Have Become a Tool for Extortion, THE VERGE, Feb. 11, 2019,
https://www.theverge.com/2019/2/11/18220032/youtube-copystrike-blackmail-three-strikes-copyright-violation.
44
   A number of services adopted (at least at some point in their history) a three-strikes-and-you’re-out policy,
including YouTube and Tumblr. See Melanie Ehrenkranz, YouTube Updates Its Three-Strikes Policy—But Not the
One You're Mad About, GIZMODO, Feb. 19, 2019, https://gizmodo.com/youtube-updates-its-three-strikes-policy-but-
not-the-on-1832726224; Jonathan Bailey, Don’t Blame the DMCA for Tumblr’s Policy, PLAGIARISM TODAY, June
23, 2015, https://www.plagiarismtoday.com/2015/06/23/dont-blame-the-dmca-for-tumblrs-policy/. Giganews had a
two-strikes policy. https://www.giganews.com/legal/dmca.html.
45
   But see BMG Rights Mgmt (US) LLC v. Cox Comms. Inc., 881 F.3d 293 (4th Cir. 2018) (a 13-strike policy was
too lax to satisfy the DMCA’s requirements).
46
   Directive 2000/31/EC of the European Parliament and of the Council of 8 June 2000 on certain legal aspects of
information society services, in particular electronic commerce, in the Internal Market, https://eur-
lex.europa.eu/legal-content/EN/ALL/?uri=CELEX:32000L0031 (the “E-Commerce Directive”).
47
   E-Commerce Directive Article 14(1)(b).
48
   See generally Heidi Tworek & Paddy Leerssen, An Analysis of Germany’s NetzDG Law, TransAtlantic Working
Group, Apr. 15, 2019, https://www.ivir.nl/publicaties/download/NetzDG_Tworek_Leerssen_April_2019.pdf.

hours of a takedown notice unless the host provides the user’s identifying information to the
complainant.49

3. The Manila Principles

The Manila Principles on Intermediary Liability50 are designed to help “policymakers and
intermediaries when developing, adopting, and reviewing legislation, policies and practices that
govern the liability of intermediaries for third-party content.”51

While the Manila Principles provide substantial guidance on these topics, the principles do not
spend much energy on the remedies question. The Manila Principles implicitly assume content
removals are the default remedy. For example, one of the six main Manila Principles says:
“Laws and content restriction orders and practices must comply with the tests of necessity and
proportionality.”52 This translates into several specific expectations that relate to remedies:

    •   “courts should only order the removal of the bare minimum of content that is necessary to
        remedy the harm identified”;53
    •   Companies should adopt “the least restrictive technical means” of restricting content;54
    •   Companies should deploy geographically variegated content restrictions, so that
        restrictions are as geographically limited as possible;55 and
    •   Companies should deploy the most temporally limited content restrictions.56

The Manila Principles refer to “content restrictions” rather than the more specific term “content
removals,” and that preserves space for other remedies. Nevertheless, the Manila Principles do
not adequately explore any other options. For example, four of the five examples describing
“content restrictions” explicitly relate to content removals or takedowns.57

4. Santa Clara Principles

In 2018, some civil society organizations and academics issued the Santa Clara Principles on
Transparency and Accountability in Content Moderation.58 The principles address: what good
transparency reports contain; how companies should provide detailed notices to users when
taking actions; and the availability of user appeals for those actions. The principles explicitly
discuss content removals and account suspensions, but they do not otherwise address remedies.
49
   UK Statutory Instruments, The Defamation (Operators of Websites) Regulations 2013, 2013 No. 3028,
https://www.legislation.gov.uk/uksi/2013/3028/pdfs/uksi_20133028_en.pdf.
50
   Manila Principles on Intermediary Liability, Version 1.0, March 24, 2015,
https://www.eff.org/files/2015/10/31/manila_principles_1.0.pdf.
51
   Manila Principles at 1.
52
   Manila Principles at 4.
53
   The Manila Principles on Intermediary Liability Background Paper, Version 1.0, 30 May 2015,
https://www.eff.org/files/2015/07/08/manila_principles_background_paper.pdf at 35.
54
   Manila Principles Background Paper at 36.
55
   Manila Principles Background Paper at 39.
56
   Manila Principles Background Paper at 40.
57
   Manila Principles Background Paper at 16-17. The fifth example is “notice-and-notice,” where a service forwards
a takedown notice to the targeted content uploader but otherwise takes no action. See Canada Copyright Act § 41.26.
58
   https://santaclaraprinciples.org/.

5. The “Internet Balancing Formula”

In 2019, European law professor Mart Susi proposed the “Internet Balancing Formula.”59 The
formula seeks to determine when the free expression value of content outweighs reasons to
suppress the content, such as privacy interests. It assigns numerical values to various factors,
some in favor of free expression and others in favor of content suppression, and computes a
precise fraction. If the result is less than 1, the content should not be restricted because of its free
expression value; if greater than 1, the content “should not be published or should be blocked.”60

This formula builds upon the legal baseline provided by the E.U. E-Commerce Directive, which
(as discussed supra) mandates the removal of tortious content. Thus, the formula treats removal
as the only applicable remedy. Interestingly, the formula actually could have signaled the virtue
of alternative remedies in close cases. For example, if the formula produced a result between 0.5
and 2, perhaps the closeness of the question would counsel something other than removal. Part
IV will revisit the relevance of close questions when deciding the appropriate remedies.
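
To illustrate the close-case idea, the following rough sketch (in Python; the factor weights, the ratio computation, and the 0.5-to-2 band are hypothetical illustrations, not Susi’s published formula) shows how a balancing calculation could output a third recommendation between “leave up” and “remove” when the result lands near 1.

    # Hypothetical sketch of a balancing ratio with a "close case" band.
    # The weights and thresholds are illustrative, not Susi's actual formula.
    def balancing_ratio(suppression_factors, expression_factors):
        # Values favoring suppression divided by values favoring free expression;
        # a result above 1 favors suppression, below 1 favors publication.
        return sum(suppression_factors) / max(sum(expression_factors), 1e-9)

    def recommended_action(ratio, grey_low=0.5, grey_high=2.0):
        if ratio < grey_low:
            return "leave up"
        if ratio > grey_high:
            return "remove or block"
        return "close case: consider an intermediate remedy (e.g., a label or reduced visibility)"

    # Example: suppression interests only slightly outweigh expression interests.
    print(recommended_action(balancing_ratio([2.0, 1.5], [2.0, 1.0])))
    # -> close case: consider an intermediate remedy (e.g., a label or reduced visibility)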

B. Copyright Initiatives

In contrast to the examples from the prior subpart, copyright has seen more innovative thinking
involving content moderation remedies.

1. Principles for User Generated Content Services

In 2007, some copyright owners announced “Principles for User Generated Content Services.”61
These principles sought to induce “services providing user-uploaded and user-generated audio
and video content” to work harder to prevent user-caused copyright infringement. The carrot:
signatories agreed not to sue services for copyright infringement if they met the principles’ very
exacting requirements.62 Those requirements include blocking users’ uploads that match a
database of precedent works, unless the copyright owner “wishes to exercise an alternative to
blocking (such as allowing the content to be uploaded, licensing use of the content or other
options).”63 This reference to blocking alternatives seems to be an early and innovative
recognition that remedies don’t have to be binary.

Only a few user-generated content services joined the principles, and for good reason. One
signatory, Veoh, already was entitled to the DMCA online safe harbor;64 yet signing the

59
   Mart Susi, The Internet Balancing Formula, 25 EUROPEAN L.J. 198 (2019) [hereinafter Susi, Balancing]; see also
Robert Alexy, Mart Susi’s Internet Balancing Formula, 25 EUROPEAN L.J. 213 (2019); Mart Susi, Reply to Robert
Alexy’s Critique of the Internet Balancing Formula, 25 EUROPEAN L.J. 221 (2019).
60
   Susi, Balancing, supra note __, at 207.
61
   Principles for User Generated Content Services, http://ugcprinciples.com/. See also Note, The Principles For User
Generated Content Services: A Middle-Ground Approach To Cyber-Governance, 121 HARV. L. REV. 1387 (2008).
62
   Principles for User Generated Content Services ¶ 14.
63
   Principles for User Generated Content Services ¶ 3(c).
64
   UMG Recordings, Inc. v. Shelter Capital Partners LLC, 667 F.3d 1022 (9th Cir. 2011).

principles did not keep Veoh from being sued into oblivion.65 As a result, these principles have
been de facto abandoned, and their potentially provocative thoughts about remedies never got
fully explored.

2. Graduated Response/Copyright Alert System

In the late 2000s, copyright owners sought to deputize Internet access providers (IAPs) to
discourage copyright infringement by their subscribers. This led to initiatives, sometimes called
“Graduated Response,”66 which imposed increasingly stringent penalties against IAP subscribers
who kept infringing copyright by file-sharing.

IAPs’ remedy options differ from other services, like web hosts, in a few important ways. First,
IAPs cannot control individual content items disseminated by subscribers (except through
disfavored and partially-effective techniques like deep packet inspection),67 so IAPs have fewer
remedy options. Second, IAP restrictions on subscribers’ accounts can hinder their ability to use
the Internet categorically; the remedy may not apply only to the problematic usage. In contrast,
the remedies applied by any individual web service normally do not affect the ability to enjoy
other services.68 Internet access is functionally a necessity in our modern society,69 so limiting
Internet access is more likely to disproportionately diminish a subscriber’s life than actions by
individual services.

Despite these differences, IAP responses to subscribers’ alleged copyright infringement have
generated some interesting remedies.

Graduated Response (Riposte Graduée) in France

France adopted a graduated response program called “HADOPI,” named for the government
agency charged with its enforcement.70 It is commonly called the “Three Strikes” law to denote
the number of infringement claims before bad things happen to the IAP subscriber:71

    •   Strike 1: email warning
    •   Strike 2: warning sent in the mail

65
   Eric Goldman, UMG v. Shelter Capital: A Cautionary Tale of Rightsowner Overzealousness, TECH. & MKTG. L.
BLOG, Dec. 20, 2011, https://blog.ericgoldman.org/archives/2011/12/umg_v_shelter_c.htm.
66
   E.g., Peter K. Yu, The Graduated Response, 62 FLA. L. REV. 1373 (2010).
67
   See Catherine J.K. Sandoval, Disclosure, Deception, and Deep-Packet Inspection: The Role of the Federal Trade
Commission Act's Deceptive Conduct Prohibitions in the Net Neutrality Debate, 78 FORDHAM L. REV. 641 (2009).
68
   But see Kashmir Hill, I Tried to Block Amazon From My Life. It Was Impossible, GIZMODO, Jan. 22, 2019,
https://gizmodo.com/i-tried-to-block-amazon-from-my-life-it-was-impossible-1830565336.
69
   E.g., Frank La Rue, Report of the Special Rapporteur on the Promotion and Protection of the Right to Freedom of
Opinion and Expression, ¶85, May 16, 2011,
https://www2.ohchr.org/english/bodies/hrcouncil/docs/17session/A.HRC.17.27_en.pdf (“the Internet has become an
indispensable tool for realizing a range of human rights, combating inequality, and accelerating development and
human progress”).
70
   The agency is “Haute Autorité pour la Diffusion des Oeuvres et la Protection des droits d'auteur sur Internet.”
71
   See Sandrine Rambaud, Illegal Internet File Downloads Under HADOPI 1 and 2, 15 No. 6 CYBERSPACE LAW. 10
