Who Moderates the Social Media Giants?
A Call to End Outsourcing

PAUL M. BARRETT

Center for Business and Human Rights
June 2020
Contents

Executive Summary ..... 1
1. Introduction ..... 3
   Sidebar: The Coronavirus Pandemic and Content Moderation ..... 6
2. The Origins and Development of Content Moderation ..... 7
   Sidebar: Ad Hoc Policies: From COVID-19 to Holocaust Denial ..... 9
3. The Moderator’s Experience ..... 12
4. Content Moderation and Volatile Countries ..... 19
   Sidebar: Many Frauds, Not Enough Fact-Checkers ..... 23
5. Recommendations ..... 24
Endnotes ..... 27

                                            Acknowledgments
                                            We extend special thanks to researchers and Stern Signature Project participants
                                            Abhinav Krishna, Tiffany Lin, and Su-Kyong Park.
                                            Thanks also to the following for their time and insights:
                                            Facebook: Monika Bickert, Ruchika Budhraja, Arun Chandra, Nick Clegg, Crystal Davis,
                                            Sarah Oh, Maxime Prades, Drew Pusateri, Guy Rosen, Miranda Sissons
                                            Google/YouTube: Markham Erickson, Mike Grosack, Alex Joseph, Radha Penekelapati,
                                            Alexandria Walden, Clement Wolf
                                            Twitter: Del Harvey, Nick Pickles
                                            Other: Joshua Brustein of Bloomberg News; Sean Burke; Adelin Cai, formerly of
                                            Pinterest, Twitter, and Google; Cori Crider of Foxglove; Yael Eisenstat of Cornell Tech
                                            and formerly of Facebook; Debrynna Garrett; Christopher Gray; Sam Gregory of Witness;
                                            Jennifer Grygiel of Syracuse University; Clifford Jeudy; Sharon Kann of Media Matters for
                                            America; David Kaye of the United Nations and the Law School of the University of California,
                                            Irvine; Daphne Keller of Stanford Law School and formerly of Google; Jason Kint of Digital
                                            Content Next; Kate Klonick of St. John’s Law School; Jay Lechner of Lechner Law;
                                            Roger McNamee, formerly of Elevation Partners; Casey Newton of The Verge; Britt Paris
                                            of Rutgers University; Tom Phillips, formerly of Google; Lawrence Pintak of Washington
                                            State University; Karen Rebelo of Boom Live; Sarah Roberts of the University of California,
                                            Los Angeles; Aaron Sharockman of PolitiFact; Diane Treanor of Coleman Legal Partners;
                                            Dave Willner of Airbnb and formerly of Facebook; Nicole Wong, formerly of Google and
Twitter; Jillian York of the Electronic Frontier Foundation; Valera Zaicev.

This report was made possible by generous support from the John S. and James L. Knight Foundation and Craig Newmark Philanthropies.

Author
Paul M. Barrett is the Deputy Director of the New York University Stern Center for Business and Human Rights.
Executive Summary

      Content moderation—the process of deciding what stays online and what gets taken down—
      is an indispensable aspect of the social media industry. Without it, online platforms would be
      inundated not just by spam, but by personal bullying, neo-Nazi screeds, terrorist beheadings,
      and child sexual abuse.

      Despite the centrality of content moderation, however, major social media companies have marginalized
      the people who do this work, outsourcing the vast majority of it to third-party vendors. A close look at this
      situation reveals three main problems:
      — In some parts of the world distant from Silicon Valley, the marginalization of content moderation has led
         to social media companies paying inadequate attention to how their platforms have been misused to
         stoke ethnic and religious violence. This has occurred in places ranging from Myanmar to Ethiopia.
         Facebook, for example, has expanded into far-flung markets, seeking to boost its user-growth numbers,
         without having sufficient moderators in place who understand local languages and cultures.
      — The peripheral status of moderators undercuts their receiving adequate counseling and medical care
         for the psychological side effects of repeated exposure to toxic online content. Watching the worst
         social media has to offer leaves many moderators emotionally debilitated. Too often, they don’t get
         the support or benefits they need and deserve.
      — The frequently chaotic outsourced environments in which moderators work impinge on their decision-
         making. Disputes with quality-control reviewers consume time and attention and contribute to a
         rancorous atmosphere.
      Outsourcing has become a common aspect of the globalized economy. Examples include customer-help
      centers in the Philippines, digital device factories in China, and clothing-production facilities in Bangladesh.
      Outsourcing is not inherently detrimental—if workers are paid fairly and treated humanely. A central question
      raised by outsourcing, in whatever industry it occurs, is whether it leads to worker exploitation. In social media,
      there’s an additional concern about whether outsourcing jeopardizes optimal performance of a critical function.

How Facebook handles harmful content (infographic)

Content that may violate the Community Standards is flagged by AI or reported by a user and routed to a content moderator. If the moderator finds that it violates the standards, it is removed from Facebook, with a potential appeal; if not, it remains on Facebook.

Content that may be misinformation is flagged by AI or reported by a user and routed to a fact-checker. If the fact-checker judges it to be misinformation,* it is down-ranked and marked as false, with a potential appeal; if not, it remains on Facebook.

*Misinformation is removed if it creates a threat of physical harm or interferes with voting or the Census.
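In code form, the routing shown above can be sketched roughly as follows. This is an illustrative sketch only: the names and boolean fields are hypothetical stand-ins, not any real Facebook API, and the human judgments of the moderator or fact-checker are reduced to a single verdict flag.

    # Illustrative sketch of the review flow in the infographic above.
    # All names are hypothetical; this is not Facebook code or a real API.
    from dataclasses import dataclass

    @dataclass
    class FlaggedItem:
        flagged_by_ai: bool              # surfaced by automated screening
        reported_by_user: bool           # surfaced by a user report
        review_type: str                 # "community_standards" or "misinformation"
        reviewer_verdict: bool           # moderator or fact-checker says "violates"/"false"
        threatens_harm_or_voting: bool = False  # triggers removal even for misinformation

    def route(item: FlaggedItem) -> str:
        """Return the outcome for one flagged post, per the flow described above."""
        if not (item.flagged_by_ai or item.reported_by_user):
            return "not reviewed"
        if item.review_type == "community_standards":
            # A human content moderator decides whether the post breaks the rules.
            return "removed from Facebook (appeal possible)" if item.reviewer_verdict else "remains on Facebook"
        # Misinformation path: a third-party fact-checker decides true vs. false.
        if item.reviewer_verdict:
            if item.threatens_harm_or_voting:
                return "removed from Facebook"
            return "down-ranked and marked as false (appeal possible)"
        return "remains on Facebook"

    # Example: a user-reported post that a fact-checker judges false.
    print(route(FlaggedItem(False, True, "misinformation", True)))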

(Executive Summary continued)

Today, 15,000 workers, the overwhelming majority of them employed by third-party vendors, police Facebook’s main platform and its Instagram subsidiary. About 10,000 people scrutinize YouTube and other Google products. Twitter, a much smaller company, has about 1,500 moderators. These numbers may sound substantial, but given the daily volume of what is disseminated on these sites, they’re grossly inadequate.

The enormous scale of the largest platforms explains why content moderation requires much more human effort to be done right. Every day, billions of posts and uploads appear on Facebook, Twitter, and YouTube. On Facebook alone, more than three million items are reported on a daily basis by users and artificial intelligence screening systems as potentially warranting removal. This degree of volume didn’t happen by accident; it stems directly from Facebook’s business strategy of relentlessly pursuing user growth in an effort to please investors and the advertisers that are the company’s paying customers.

Focusing primarily on Facebook as a case study, this report begins with an Introduction and overview, followed by a sidebar on page 6 about the interplay between the coronavirus pandemic and content moderation. Part 2 describes the origin and early development of moderation. Infographics providing a statistical breakdown of content moderation by Facebook, YouTube, and Twitter appear on pages 10 and 11. Part 3 examines problems with Facebook’s content moderation, with an emphasis on the lack of adequate health care for the people who do it and the generally chaotic environment in which they work. This section of the report draws heavily on interviews with former moderators who describe meager “wellness” programs in workplaces characterized by a surprising degree of contentiousness. Part 4 looks at the lack of adequate moderation in at-risk countries in regions such as South Asia. A sidebar on fact-checking, a variant of content moderation, appears on page 23. And finally, Part 5 offers recommendations for improving the situation, which we also provide here, in capsule form:

Summary of Our Recommendations to Social Media Companies

1. End outsourcing of content moderators and raise their station in the workplace. Facebook—and YouTube and Twitter—should gradually bring on board, with suitable salaries and benefits, a significant staff of content moderators drawn from the existing corps of outsourced workers and others who want to compete for these improved jobs.

2. Double the number of moderators to improve the quality of content review. Members of an expanded moderator workforce would have more time to consider difficult content decisions while still making these calls promptly.

3. Hire a content overseer. To streamline and centralize oversight, Facebook—and the other platforms—each should appoint a senior official to supervise the policies and execution of content moderation.

4. Expand moderation in at-risk countries in Asia, Africa, and elsewhere. The citizens of these nations deserve sufficient teams of moderators who know local languages and cultures—and are full-time employees of the social media companies.

5. Provide all moderators with top-quality, on-site medical care. At the center of this improved medical care should be the question of whether a given employee is capable of continuing to moderate the most disturbing content.

6. Sponsor research into the health risks of content moderation. While the danger of post-traumatic stress disorder and related conditions seems obvious, the social media companies still lack a sufficient understanding of the precise risks their moderators face. High-quality academic research is needed to address this gap.

7. Explore narrowly tailored government regulation. One interesting idea comes from Facebook itself. The company suggests government oversight of the “prevalence” of harmful content, which it defines as the frequency with which deleterious material is viewed, even after moderators have tried to weed it out.

8. Significantly expand fact-checking to debunk mis- and disinformation. Disproving conspiracy theories, hoaxes, and politically motivated mis- and disinformation is a noble pursuit but one that’s now being done on too small a scale.
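Recommendation 7 turns on Facebook’s notion of “prevalence,” the frequency with which harmful material is actually viewed. As a rough sketch of how such a metric might be computed—the formula here is an assumption drawn from the definition above, not Facebook’s published methodology—prevalence can be treated as the share of sampled content views that land on violating material:

    # Hypothetical illustration of a "prevalence"-style metric: the share of all
    # content views that were views of violating material. Not Facebook's actual methodology.
    def prevalence(views_of_violating_content: int, total_content_views: int) -> float:
        return views_of_violating_content / total_content_views

    # Example: 5 violating views per 10,000 sampled views -> 0.05% prevalence.
    print(f"{prevalence(5, 10_000):.2%}")

Oversight keyed to a measure like this would focus on a view-weighted ratio over time rather than on raw takedown counts.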

1. Introduction

‘Content moderators are the people literally holding this platform together. They are the ones keeping the platform safe.’
— A Facebook design engineer participating on an internal company message board, 2019

Picture what social media sites would look like without anyone removing the most egregious content posted by users. In short order, Facebook, Twitter, and YouTube (owned by Google) would be inundated not just by spam, but by personal bullying, neo-Nazi screeds, terrorist beheadings, and child sexual abuse. Witnessing this mayhem, most users would flee, advertisers right behind them. The mainstream social media business would grind to a halt.

Content moderation—deciding what stays online and what gets taken down—is an indispensable aspect of the social media industry. Along with the communication tools and user networks the platforms provide, content moderation is one of the fundamental services social media offers—perhaps the most fundamental. Without it, the industry’s highly lucrative business model, which involves selling advertisers access to the attention of targeted groups of users, just wouldn’t work.1

“Content moderators are the people literally holding this platform together,” a Facebook design engineer reportedly said on an internal company message board during a discussion of moderator grievances in early 2019. “They are the ones keeping the platform safe. They are the people Zuck [founder and CEO Mark Zuckerberg] keeps mentioning publicly when we talk about hiring thousands of people to protect the platform.”2

And yet, the social media companies have made the striking decision to marginalize the people who do content moderation, outsourcing the vast majority of this critical function to third-party vendors—the kind of companies that run customer-service call centers and back-office billing systems. Some of these vendors operate in the U.S., others in such places as the Philippines, India, Ireland, Portugal, Spain, Germany, Latvia, and Kenya. They hire relatively low-paid labor to sit in front of computer workstations and sort acceptable content from unacceptable.

The coronavirus pandemic has shed some rare light on these arrangements. As the health crisis intensified in March 2020, Facebook, YouTube, and Twitter confronted a logistical problem: Like millions of other workers, content moderators were sent home to limit exposure to the virus. But the platforms feared that allowing content review to be done remotely from moderators’ homes could lead to security and privacy breaches. So the social media companies decided temporarily to sideline their human moderators and rely more heavily on automated screening systems to identify and remove harmful content. In normal times, these systems, powered by artificial intelligence (AI), identify and, in some cases, even eliminate, certain disfavored categories of content, such as spam and nudity. Other categories, including hate speech and harassment, typically still require human discernment of context and nuance.

In unusual public concessions, Facebook, YouTube, and Twitter acknowledged that depending more on automation would come at a price: Used on their own, the AI systems are prone to removing too much content. The algorithms “can sometimes lack the context that our teams [of human moderators] bring, and this may result in us making mistakes,” Twitter said.3 These admissions underscored the continuing fallibility of automated screening and the corresponding value of human review. As of this writing, outsourced human moderation was just beginning to resume, as some Facebook reviewers returned to their offices on a voluntary basis and others were allowed to do certain work from home. (For more on the coronavirus and moderation, please see the sidebar on page 6.)

In more ordinary times, three other problems arise from the outsourcing of content moderation. First, in some parts of the world distant from Silicon Valley, the marginalization of moderation has led to the social media companies paying inadequate attention to how their platforms have been misused to stoke ethnic and religious violence. This has occurred in places such as Myanmar and Ethiopia. Facebook, for example, has expanded into far-flung markets, seeking to boost its user-growth numbers, without having sufficient moderators in place who understand local languages and cultures. Second, the peripheral status of moderators undercuts their receiving adequate counseling and medical care for the psychological side effects of repeated exposure to toxic online content. Watching the worst social media has to offer leaves many moderators emotionally debilitated. Too often, they don’t get the support or benefits they need and deserve. And third, the frequently chaotic outsourced environments in which moderators work impinge on their decision-making.

Despite the crucial role they perform, content moderators are treated, at best, as second-class citizens. Some full-time employees recognize the anomaly. “Why do we contract out work that’s obviously vital to the health of this company and the products we build?” a Facebook product manager asked during the same early-2019 in-house message board exchange about moderators’ discontent.4

‘Plausible Deniability’

According to Sarah Roberts, a pioneering scholar of content moderation, the social media companies handle the activity in a fashion that diminishes its importance and obscures how it works. “It’s a way to achieve plausible deniability,” Roberts, an information studies expert at the University of California, Los Angeles, says in an interview. “It’s a mission-critical function, but you fulfill it with your most precarious employees, who technically aren’t even your employees.”5

Beyond the distancing effect identified by Professor Roberts, outsourcing saves social media companies significant amounts of money on moderation, just as it lowers costs for janitorial, food, and security services. Contract moderators don’t enjoy the generous pay scales and benefits characteristic of Silicon Valley. Outsourcing also has given the tech companies greater flexibility to hire moderators in a hurry without having to worry about laying them off if demand for their services wanes.

Similar factors have motivated outsourcing by Western corporations in other contexts, ranging from customer-help centers in the Philippines to digital device factories in China and clothing-production facilities in Bangladesh. Outsourcing, to be sure, is not inherently detrimental. Bangladeshi women stitching t-shirts and trousers for sale in the U.S. and Europe need employment and may not have better options. If workers are paid fairly and treated humanely, outsourcing can represent a salutary aspect of the globalized economy. Likewise, social media moderators may take the job eagerly, however modest the wage or harrowing the subject matter.

A central question raised by outsourcing, in whatever industry it occurs, is whether it leads to worker exploitation. In social media, there’s an additional concern about whether outsourcing jeopardizes optimal performance of a critical function.

Each of the social media platforms began to do content moderation on a limited basis soon after the start of its operations—Facebook in 2004, YouTube in 2005, and Twitter in 2006. Improvising at first, employees fielded user complaints about offensive posts or comments the way they might deal with forgotten passwords. The process has evolved since then, as the platforms have adopted elaborate sets of standards to guide moderation and built automated screening systems relying on AI. Moderation has become a hybrid of algorithmic and human analysis in which live reviewers assess huge volumes of potentially problematic posts spotted by users or AI technology.

Today, 15,000 workers, the overwhelming majority of them employed by third-party vendors, police Facebook’s main platform and its Instagram subsidiary. About 10,000 people scrutinize YouTube and other Google products. Twitter, a much smaller company, has about 1,500 moderators. These numbers may sound substantial, but they’re woefully inadequate, Jennifer Grygiel, a social media scholar at Syracuse University, says in an interview. “To get safer social media, you need a lot more people doing moderation.”6

A Problem of Scale

The enormous scale of the largest social media platforms explains why content moderation requires additional human effort. Every day, billions of posts and uploads appear on Facebook, Twitter, and YouTube. On Facebook alone, more than three million items are reported on a daily basis by AI systems and users as potentially warranting removal.7 This degree of volume didn’t happen by accident; it stems directly from Facebook’s business strategy of relentlessly pursuing user growth in an effort to please investors and the advertisers that are the company’s paying customers.

A moderation workload of this heft, combined with the complexity of some of the decisions, naturally leads to errors. Zuckerberg conceded in a November 2018 white paper that moderators “make the wrong call in more than one out of every 10 cases.” He didn’t specify how many erroneous calls that equates to, but if Facebook moderators review three million posts a day, Zuckerberg’s 10% error rate implies 300,000 blunders every 24 hours. To his credit, the CEO admitted that given the size of Facebook’s user base—now some 2.5 billion people—“even if we were able to reduce errors to one in 100, that would still be a very large number of mistakes.”8

The gargantuan volume of online material creates other problems, as well, including the difficulty of devising general moderation rules that encompass the circumstances of billions of daily posts and uploads. In a landmark episode from 2016, Facebook removed a frontal photo image of a naked nine-year-old Vietnamese girl running in terror from an explosion. A straightforward violation of the platform’s rules against nudity and child exploitation? Not necessarily. The 1972 photo, informally known as “Napalm Girl,” had won a Pulitzer Prize and remains an iconic representation of the Vietnam War. After a public outcry, Facebook reversed itself, restored the picture, and created an exception for otherwise offensive content that’s “newsworthy.”9

The “Napalm Girl” incident illustrates that content moderation will never achieve perfect results or please everyone. Some harmful content will persist on mainstream platforms, and moderators will sometimes censor items unwisely. But these realities should not become an argument against incremental improvement, which this report urges in the recommendations that begin on page 24.

Facebook does a variation of content review called third-party fact-checking, which also deserves consideration. Fact-checking evaluates not whether content falls into a forbidden category like hate speech, but instead whether content is true or false. Facebook outsources fact-checking but in a different way from how it handles content moderation and in a fashion that we support. The company hires journalism organizations and specialty fact-checking websites to determine whether content flagged by users or AI is accurate. When it’s identified, false content typically is not removed. Facebook labels and down-ranks it in users’ News Feed so that fewer people see it. Scale becomes an issue for fact-checking, as it does for content moderation. Facebook sends flagged content to more than 60 fact-checking organizations worldwide, but each organization typically assigns only a handful of reporters to investigate Facebook posts. The number of potentially false Facebook items far exceeds fact-checkers’ capacity, meaning that the system is overwhelmed and ultimately inadequate to the task. Like content moderation, fact-checking demands more resources. (For more on fact-checking, please see the sidebar on page 23 and our recommendations on page 24.)

In an era when much of our political and cultural expression takes place online, content moderation and fact-checking, while little understood by the broader public, play an important role helping to shape democracy and society at large. Social media companies could improve their performance by bringing content review closer to the core of their corporate activities, greatly increasing the number of human moderators (even while continuing to refine AI screening software), and elevating moderators’ status to match the significance of their work. On the fact-checking front, Facebook ought to use its ample resources to expand capacity as well, while Twitter and YouTube, however belatedly, need to follow Facebook’s example. Given the stakes for social media users, and everyone else affected by what happens online, the companies have an obligation to take swift, dramatic action.

Why Focus on Facebook?

All three of the major social media platforms—as well as many others—undertake content moderation. The following pages, however, focus primarily on Facebook. There are three reasons for a case study approach: First, Facebook, headquartered in Menlo Park, Calif., deserves close scrutiny because it’s the largest competitor in its segment of the industry and has served as a trend-setter in content moderation. Second, putting one company under the microscope allows for a more detailed look at corporate practices. From this examination, lessons can be developed and applied to other companies, as well. And third, Facebook was more forthcoming than its rivals YouTube and Twitter. That we have rewarded this responsiveness with more in-depth attention may strike some people at Facebook as ironic, if not downright irritating. We hope that upon reading the report, they will find it tough-minded but fair.

Facebook’s openness, we should emphasize, went only so far. Like YouTube and Twitter, Facebook turned down our repeated requests to visit one or more of its moderation sites. This denied us access to current reviewers and obliged us to seek out individuals who formerly did moderation work. More broadly, Facebook declined to answer a number of our questions, including basic ones such as what percentage of its moderator workforce it outsources. Still, we would like to think that Facebook’s greater communicativeness overall indicates a willingness to consider our recommendations—and serve as an example to other platform companies, which bear the same responsibility as Facebook to improve how they do content moderation.
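The back-of-envelope arithmetic quoted earlier on this page can be written out explicitly. A minimal worked check, using only the report’s own figures (roughly three million reviewed items a day, an error rate of more than one in ten, and Zuckerberg’s aspirational one-in-100 rate):

    # Worked check of the error figures discussed above; the inputs are the
    # report's own numbers, not new data.
    daily_reviewed_items = 3_000_000   # items reported for review each day
    current_error_rate = 0.10          # "wrong call in more than one out of every 10 cases"
    aspirational_error_rate = 0.01     # "one in 100"

    print(int(daily_reviewed_items * current_error_rate))       # 300000 erroneous calls per day
    print(int(daily_reviewed_items * aspirational_error_rate))  # 30000, still a large number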

The Coronavirus Pandemic and Content Moderation
The coronavirus pandemic has shaken the global economy to its foundation, causing factory closures, transportation shut-downs, and mass layoffs. A more modest effect in the social media industry concerned content moderation. In the name of social distancing, thousands of reviewers were sent home. But as noted in the main text of this report, Facebook, YouTube, and Twitter didn’t want content review to take place remotely for fear of potential software-security breaches or user-privacy violations. All three social media companies announced in mid-March that they would temporarily reduce their reliance on human moderation and shift more of the content-review burden to their AI-driven technology.1

Facebook went a step further. In light of the stay-at-home edicts affecting many of the company’s outsourced moderators, Mark Zuckerberg explained during a March 18, 2020, teleconference with reporters that he had decided to enlist some of the company’s full-time employees to handle the review of “the most sensitive types of content.” He mentioned content related to suicide and self-harm, child exploitation, and terrorist propaganda. As with the shift to more reliance on AI, Zuckerberg wasn’t specific about how long the hand-off of responsibility for sensitive content would last. Facebook would stick with the rearrangement, he said, “for the time being.”2

By early May, a small number of Facebook moderators were beginning to return to their workstations at offices run by third-party vendors. As the grip of the pandemic loosens, all three of the major social media companies are expected to reestablish the outsourcing structure that existed before the health crisis. Part of the reason for this is that AI can’t get the job done on its own.3

In his session with journalists, Zuckerberg conceded that the pandemic-induced reliance on technology would lead to more mistakes. The company’s algorithms inevitably would “take down some content that was not supposed to be taken down,” he said. This prediction soon proved correct. In one illustration, Facebook’s automated system—calibrated to ban virus profiteering—mistakenly flagged posts by volunteers making protective masks for doctors and nurses.4 YouTube and Twitter made similar statements warning of overly aggressive algorithms. All of these unusual concessions about the shortcomings of technology strongly imply a continuing need for human involvement in content moderation.

Another lesson from the pandemic is that more of this human involvement should come from people who are full-time Facebook employees—like the full-timers given responsibility for reviewing sensitive content during the coronavirus emergency. Facebook should use its response to the public health calamity as a pilot project to assess the feasibility of making all content moderators Facebook employees. The sense of trust that Zuckerberg has in his own people—signaled by his turning to them during a time of national crisis—suggests an opening for discussion within Facebook’s senior ranks.

Central to our recommendations, which begin on page 24, is the idea that, quite apart from temporary pandemic work-arounds, Facebook and its rival platforms need to end the outsourcing of responsibility for content review. Doing so would address the trio of dangers that outsourcing creates: first, that at-risk countries receive insufficient attention from moderators; second, that moderators’ mental health is not adequately protected; and third, that the outsourced environment is not conducive to the sort of careful content assessment that’s vital for user safety.

1 Elizabeth Dwoskin and Nitasha Tiku, “Facebook Sent Home Thousands of Human Moderators Due to the Coronavirus. Now the Algorithms Are in Charge,” The Washington Post, March 23, 2020, https://www.washingtonpost.com/technology/2020/03/23/facebook-moderators-coronavirus/.
2 Mark Zuckerberg, “Media Call,” Facebook, March 18, 2020, https://about.fb.com/wp-content/uploads/2020/03/March-18-2020-Press-Call-Transcript.pdf.
3 “Coronavirus: Facebook Reopens Some Moderation Centers,” BBC, April 30, 2020, https://www.bbc.com/news/technology-52491123.
4 Mike Isaac, “In a Crackdown on Scams, Facebook Also Hampers Volunteer Efforts,” The New York Times, April 6, 2020, https://www.nytimes.com/2020/04/05/technology/coronavirus-facebook-masks.html.

2. The Origins and Development of Content Moderation

‘We were supposed to delete things like Hitler and naked people,’ recalls Dave Willner, a member of Facebook’s pioneering content moderation group. He and his colleagues were told to remove material ‘that made you feel bad in your stomach.’

Begun haphazardly and developed on the fly, content moderation initially was intended to insulate users from pornography and intolerance. This aim has persisted, even as moderation also became a shield with which social media platforms have sought to fend off controversy and negative publicity, says Professor Roberts of UCLA. “It’s the ultimate brand-protection scheme. It’s brand protection in the eyes of users and in the eyes of advertisers.”

When Dave Willner arrived at Facebook in 2008, not long after earning his undergraduate degree in anthropology from Bowdoin College, content moderation was still a modest affair, performed in-house. It had its roots in the early years of the internet, when volunteers helped oversee chat rooms by reporting “offensive” content. During his second year at Facebook, Willner joined a team of 12 people who followed a list of moderation rules contained on just a single page. “We were supposed to delete things like Hitler and naked people,” he recalls in an interview. More generally, they removed content “that made you feel bad in your stomach.” Even though Facebook already had about 100 million users, the dozen in-house moderators didn’t feel overwhelmed, he says. With a primarily American clientele, Facebook “was still something mostly for college students and recent graduates, and most of us were recent graduates, so for the most part, we understood what we were looking at.” Early in their corporate lives, YouTube and Twitter gave their moderators similarly bare-bones instructions of what to remove.

Willner and others involved in what might be called the artisanal phase of content moderation could do their jobs without fear of legal repercussions, at least within the U.S. This was thanks to a federal provision known as Section 230 of the Communications Decency Act of 1996. An extraordinary boon to online commerce, the law shields internet platforms from liability for most content posted by users. This protection applies even if platforms actively moderate user-generated content. According to St. John’s University legal scholar Kate Klonick, “the existence of Section 230 and its interpretation by courts have been essential to the development of the internet as we know it today.”10

Seeking Simplicity

Empowered by Section 230, Willner took the lead in replacing Facebook’s one page of moderation rules with a more fully developed set of guidelines. The 15,000-word document he eventually produced remains the basis of the company’s publicly available Community Standards, which have been amended many times over the years. The standards favor free speech when possible, Willner says. But their overarching goal is to provide moderators with simple, concrete rules that can be applied consistently by nonlawyers.

This aim became increasingly important as Facebook expanded, and content moderation expanded along with it.

By 2010, explosive growth at Facebook made it impossible for its in-house content-review team to handle the increased volume of user reports about spam, pornography, hatred, and violence. The social network needed more moderators. “There wasn’t much debate about what to do, because it seemed obvious: We needed to move this to outsourcing,” Willner says. “It was strictly a business-ops decision,” based on cost concerns and the greater flexibility outsourcing offered. By 2013, when Willner left the company, he says, Facebook had more than a billion users and about 1,000 moderators, most of them now outsourced. This produced a ratio of one moderator for every million users.

Content moderation was outsourced in another sense, as well. Facebook relied heavily on users, acting without compensation, to report potentially offensive or dangerous content to the company. This reliance on what amounted to an enormous volunteer corps of harmful content scouts meshed with yet another aspect of the social media business model: the expectation that users would supply most of the material—ranging from puppy pictures to political punditry—that draws people to social media. These several forms of outsourcing have combined to help Facebook keep its full-time employee headcount—now at 45,000—considerably lower than it otherwise would be, while raising its enviable profit margins. Moreover, while the move to lower-cost outsourcing of content moderation might seem to some within Facebook as having been inevitable, it was, in fact, a purposeful choice, driven by financial and logistical priorities. Over time, this choice would have distinct consequences.

As Facebook grew, disturbing content on the site proliferated. In 2013, a public controversy flared when Facebook moderators failed to remove content from groups and pages featuring supposedly humorous references to sexual assaults on women. One of the “jokes” included the punch line, “Tape her and rape her,” written over a photo of a woman with heavy white tape covering her mouth. Feminists expressed outrage, and certain large corporations threatened to pull their advertising from the platform. After initially insisting that the misogynistic material didn’t violate its hate speech rules, Facebook reversed itself, announced that the rape jokes did breach its standards after all, and took them down.11

The clash over rape jokes illustrated Facebook’s struggle to balance free expression against freedom from hatred and cruelty. Twitter leaned more decisively toward allowing users to speak freely. Unfortunately, this strategy made Twitter a notorious destination for trolls and hate groups, one where instances of harassment were, and to some extent still are, common.12 In 2015, Guardian columnist Lindy West recounted her years-long experience with bullies who tweeted vile taunts at her, such as, “No one would want to rape that fat, disgusting mess.” In response, Twitter’s then-CEO, Dick Costolo, wrote a scathing internal memo, which promptly leaked. “We suck at dealing with abuse and trolls on the platform,” Costolo wrote, “and we’ve sucked at it for years.”13

Despite the challenges it presented for all of the platforms, content moderation continued to expand rapidly. By 2016, Facebook had 1.7 billion users and 4,500 moderators, most of them employed by outside contractors. The ratio of moderators to users was much improved from three years earlier but still stood at one to 377,777. Today, the ratio is one to 160,000, and the company uses far more automation to complement human reviewers.

Two key characteristics have reinforced the marginalization of the function. First, it’s a source of almost exclusively bad news: Tech journalists and the public typically focus on content moderation when it fails or sparks contention, not on the countless occasions when it works properly. “No one says, ‘Let’s write a lengthy story on all of the things that didn’t happen on Twitter because of successful moderation,’” observes Del Harvey, the platform’s vice president for trust and safety.

Tom Phillips, a former executive at Google who left that company in 2009, makes a related point. Moderation has never been fully accepted into Silicon Valley’s vaunted engineering-and-marketing culture, he says. “There’s no place in that culture for content moderation. It’s just too nitty-gritty.”

Content moderation presents truly difficult challenges. Before the 2000s, corporations hadn’t confronted a task quite like it. But the difficulty stems directly from the business models chosen by Facebook, YouTube, Twitter, and some of their smaller rivals. These models emphasize an unremitting drive to add users and demonstrate growth to investors. More users attract more revenue-generating advertising, but they also produce more content to moderate and more permutations of meaning, context, and nuance—all of which invite error.

A CEO’s Concession

In the wake of the 2016 presidential election debacle, in which Russian operatives used Facebook, Instagram, and Twitter to spread disinformation, Mark Zuckerberg was on the defensive. Facebook wasn’t keeping up with the content challenges it faced, he conceded in a February 2017 public essay: “In the last year, the complexity of the issues we’ve seen has outstripped our existing processes for governing the community.” Moderators, he continued, were “misclassifying hate speech in political debates in both directions—taking down accounts and content that should be left up and leaving up content that was hateful and should be taken down.” He pointed to the precipitous removal of “newsworthy videos related to Black Lives Matter and police violence”—content that often included raw language about race, but was uploaded in the spirit of combating racism. In many instances, the CEO added, the company’s reliance on users to report troubling content simply wasn’t working. He tried to deflect the blame for this.

that perhaps could have been prevented
if someone had realized what was
happening and reported them sooner.            Ad Hoc Policies: From COVID-19 to Holocaust Denial
There are cases of bullying and harass-
ment every day that our team must be           Content moderation is a large and complicated topic. This report covers
alerted to before we can help out.”14          moderation problems related to outsourcing but not to other dimensions of the
                                               subject. For example, it does not delve into questions surrounding high-level
Zuckerberg’s suggestion that users bear primary responsibility for policing Facebook obscures that he and his business colleagues designed the system, flaws and all, and failed to anticipate how much harmful content Facebook would attract.

Three months later, Zuckerberg announced that he would increase the number of content moderators by two-thirds, to 7,500. When describing such expansions, the CEO and other Facebook executives generally haven’t mentioned that the majority of moderators are outsourced workers, not full-time company employees. The 3,000 new moderator hires in 2017 followed public outcry over the posting of violent videos. One showed the fatal shooting of a 74-year-old retiree in Cleveland; another, a live stream, depicted a man in Thailand killing his 11-month-old daughter. It took Facebook moderators more than two hours to remove the Cleveland video. The footage from Thailand stayed up for about 24 hours and was viewed roughly 370,000 times. “If we’re going to build a safe community, we need to respond quickly,” Zuckerberg wrote in a post.15

In December 2017, ProPublica took a revealing look at moderation. The nonprofit investigative journalism organization solicited from its readers instances where they believed Facebook had erred in applying its own standards. Of the more than 900 posts submitted, ProPublica asked Facebook to explain a sample of 49 items, most of which involved leaving up material that ProPublica and its readers perceived as hate speech. Facebook acknowledged that in 22 cases, its moderators had made mistakes. Even for a sample selected in this nonrepresentative fashion, the 45% error rate was remarkable. “We’re sorry for the mistakes we have made—they do not reflect the community we want to build,” Facebook said in a statement at the time. “We must do better.”16
Ad Hoc Policies: From COVID-19 to Holocaust Denial

Content moderation is a large and complicated topic. This report covers moderation problems related to outsourcing but not to other dimensions of the subject. For example, it does not delve into questions surrounding high-level policy decisions about moderation made by senior executives. One illustration of such a determination was Facebook’s laudable decision beginning in the winter of 2020 to act more aggressively than usual to remove dangerous misinformation related to the COVID-19 pandemic: false cures, conspiracy theories, and the like. Another high-level policy call of a very different sort concerns posts that deny that the Holocaust took place. Facebook continues, unwisely, to allow content that promotes the idea that the Holocaust never occurred, despite the fact that such content represents rank anti-Semitism.1

These policies, for better or worse, help determine the kind of material available to users on Facebook. But the decisions behind them presumably would be made regardless of whether content moderators work on an outsourced basis. An analogous decision in the fact-checking realm was Mark Zuckerberg’s determination in the fall of 2019 that Facebook, in the name of free speech, would not review political advertising for untrue statements. Again, this was a weighty judgment call—one that unfortunately extended a virtual invitation to politicians and their campaigns to lie—but not one that bears on how Facebook structures its relationship with fact-checkers.2

Facebook has a serious and systematic process for routinely amending its Community Standards. But too often, high-level content policy decisions seem ad hoc and reactive. The company’s decision-making about white nationalism and white separatism illuminates the problem. Before March 2019, Facebook had prohibited hateful attacks on people based on characteristics such as race and ethnicity. This prohibition included expressions of white supremacy. But the platform distinguished between white supremacy, on the one hand, and white nationalism and separatism, on the other. The distinction was based on the dubious notion that the latter could be bound up with legitimate aspects of people’s identity.3

In 2018, the tech news site Motherboard published internal documents used to train Facebook content moderators. The documents showed that Facebook allowed “praise, support, and representation” of white nationalism and separatism “as an ideology.” This sparked a new round of criticism from civil rights advocates, who had long contended that white nationalism and separatism stood for the same repugnant ideas as white supremacy. Under pressure from these advocates, Facebook changed its position, ultimately reaching the right result, although belatedly and in a roundabout manner.4

1. Ezra Klein, “The Controversy Over Mark Zuckerberg’s Comments on Holocaust Denial, Explained,” Vox, July 20, 2018, https://www.vox.com/explainers/2018/7/20/17590694/mark-zuckerberg-facebook-holocaust-denial-recode; Jonathan A. Greenblatt, “Facebook Should Ban Holocaust Denial to Mark 75th Anniversary of Auschwitz Liberation,” USA Today, January 26, 2020, https://www.usatoday.com/story/opinion/2020/01/26/auschwitz-liberation-ban-holocaust-denial-on-facebook-column/4555483002/.
2. “Facebook’s Zuckerberg Grilled Over Ad Fact-Checking Policy,” BBC, October 24, 2019, https://www.bbc.com/news/technology-50152062.
3. Tony Romm and Elizabeth Dwoskin, “Facebook Says It Will Now Block White-Nationalist, White-Separatist Posts,” The Washington Post, March 27, 2019, https://www.washingtonpost.com/technology/2019/03/27/facebook-says-it-will-now-block-white-nationalist-white-separatist-posts/.
4. Joseph Cox, “Leaked Documents Show Facebook’s Post-Charlottesville Reckoning With American Nazis,” Motherboard, May 25, 2018, https://www.vice.com/en_us/article/mbkbbq/facebook-charlottesville-leaked-documents-american-nazis.
Moderation by the numbers: removing harmful content

Beyond spam and fake accounts, Facebook contends with adult nudity, violence, and dangerous organizations...

Facebook content removed or covered* (first quarter of 2020, the most recent available data):
   Spam: 1.9 billion pieces of content
   Fake accounts: 1.7 billion
   Other (nudity, violence, hate speech, etc.): 107.5 million
   *Facebook covers some nonviolating content and provides a warning that it may be disturbing.

Heavy reliance on artificial intelligence. Percentage of content removed or covered that was flagged by Facebook AI technology before any users reported it (first quarter of 2020):
   Bullying and harassment: 15.6%
   Hate speech: 88.8%
   Suicide and self-injury: 97.7%
   Violent and graphic content: 99%
   Child nudity and sexual exploitation of children: 99.5%

Facebook removals other than fake accounts and spam* (first quarter of 2020, in millions):
   Adult nudity and sexual activity: 39.5
   Violent and graphic content: 25.5
   Dangerous organizations (terrorism and organized hate): 11
   Hate speech: 9.6
   Drugs and firearms: 9.3
   Child nudity and sexual exploitation of children: 8.6
   Bullying and harassment: 2.3
   Suicide and self-injury: 1.7
   *Includes some content that is covered but not removed.

Prevalence of selected forms of harmful content. Prevalence measures how often harmful content slips past moderation efforts and remains available to users. The figures estimate the upper limit of views per 10,000 of content that violated the community standard in question:*
   Child nudity and sexual exploitation: 5
   Drugs and firearms: 5
   Suicide and self-injury: 5
   Terrorist propaganda: 5
   Adult nudity and sexual activity: 6
   Violent and graphic content: 8
   *Facebook generates prevalence estimates based on its own sampling of content.

Detailed rules govern content moderation. Facebook provides highly specific guidelines for moderators to enforce. Here’s one example:
   Hate speech: Facebook bans “direct attacks” on people based on “protected characteristics,” such as race, ethnicity, national origin, religious affiliation, sexual orientation, gender, or disability. Direct attacks can take the form of “violent speech,” “dehumanizing speech or imagery,” and derogatory comparisons to insects, animals perceived as intellectually or physically inferior, filth, bacteria, disease, or feces. Certain designated comparisons are also prohibited, including Blacks and apes, Jews and rats, Muslims and pigs, and Mexicans and worm-like creatures.

Source: Facebook
...while YouTube takes down videos endangering children, and Twitter suspends accounts for hateful conduct.

YouTube videos removed (fourth quarter of 2019, the most recent available data): 5,887,021 videos removed
   Spam, misleading or scams: 52%
   Child safety: 15.8%
   Nudity or sexual: 14.1%
   Violent or graphic: 9.8%
   Other: 5.2%
   Harmful or dangerous: 3.1%

YouTube comments removed (fourth quarter of 2019): 540,195,730 comments removed
   Spam, misleading or scams: 58.9%
   Hateful or abusive: 24.7%
   Harassment or cyberbullying: 8.3%
   Child safety: 7.9%
   Harmful or dangerous: 0.1%
   Other: 0.1%

YouTube channels removed (fourth quarter of 2019): 2,088,253 channels removed
   Spam, misleading or scams: 89.1%
   Nudity or sexual: 6.4%
   Child safety: 2.4%
   Other: 0.9%
   Harassment or cyberbullying: 0.6%
   Promotion of violence or violent extremism: 0.6%

Twitter accounts locked or suspended* (first half of 2019, the most recent available data): 1,254,226 accounts locked or suspended
   Hateful conduct: 46.6%
   Abuse: 31.6%
   Impersonation: 9.9%
   Violent threats: 4.5%
   Sensitive media: 3.5%
   Child sexual exploitation: 2.4%
   Private information: 1.5%
   *Twitter enforcement actions range from temporarily disabling accounts to shutting them down altogether.

Selected Twitter enforcement statistics (first half of 2019):
   50% of tweets that Twitter took action on for abuse were proactively identified using technology, rather than being reported by users. This compares to 20% a year earlier.
   105% more accounts overall were locked or suspended for violating rules.
   119% more accounts were suspended for violating private information rules.
   133% more accounts were locked or suspended for hateful conduct.
   30% fewer accounts were suspended for promotion of terrorism.

Sources: YouTube and Twitter
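To make the two recurring metrics in these figures concrete (the share of enforcement actions flagged by automated systems before any user report, and the prevalence of violating content per 10,000 views), the short Python sketch below shows how such numbers can be derived. It is a minimal illustration using hypothetical counts and simplified formulas of our own; it is not the platforms’ actual methodology or code.

    # Illustrative sketch only: hypothetical counts, not platform data or methodology.

    def proactive_rate(flagged_by_ai: int, reported_by_users: int) -> float:
        """Share (%) of actioned content that automated systems flagged before any user report."""
        total_actioned = flagged_by_ai + reported_by_users
        return 100.0 * flagged_by_ai / total_actioned

    def prevalence_per_10k(violating_views_in_sample: int, sampled_views: int) -> float:
        """Estimated views of violating content per 10,000 content views, based on a sample of views."""
        return 10_000 * violating_views_in_sample / sampled_views

    # Hypothetical example: 8,880 of 10,000 hate speech removals were flagged by AI first,
    # and 6 violating views turned up in a sample of 10,000 views.
    print(f"Proactive rate: {proactive_rate(8_880, 1_120):.1f}%")           # -> 88.8%
    print(f"Prevalence: {prevalence_per_10k(6, 10_000):.1f} per 10,000")    # -> 6.0 per 10,000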
3. The Moderator’s Experience

Mark Zuckerberg once said that “in a lot of ways, Facebook is more like a government than a traditional company. We have this large community of people, and more than other technology companies we’re really setting policies.”17

“We still have enforcement problems. You’re going to hear that across the industry.”
— Monika Bickert, Facebook’s vice president for global policy management

If Facebook is like a government, then Zuckerberg heads the executive branch, or perhaps rules as monarch. The legislature takes the form of a policy team of more than 100 people working under a company vice president named Monika Bickert. A former federal prosecutor, Bickert leads a rolling process of supplementing and amending Facebook’s public Community Standards, as well as its internal guidelines interpreting the standards. Content moderators, in this scheme, are the police officers who enforce the standards. To do so, the cops on the beat rely on leads from two types of informants: human users and, increasingly, inanimate AI flagging systems.

The moderators don’t work for the Facebook government, however. They are rent-a-cops, employed by third-party vendors.

By all accounts, Bickert’s policy group works with great earnestness to craft highly granular rules that she says are intended to remove as much discretion as possible from moderators. Specificity “makes it possible for us to apply policies at this unprecedented scale of billions of posts every day,” she adds. But asked about Zuckerberg’s estimate of a 10% error rate, she bluntly acknowledges: “We still have enforcement problems. You’re going to hear that across the industry.”

To round out Zuckerberg’s government metaphor, Facebook has launched the equivalent of a judicial branch. A nascent semi-autonomous Oversight Board—informally referred to as the Supreme Court of Facebook—will be populated by 40 outsiders. Facebook named the first 20 in May 2020, and the list included an impressive, diverse array of legal scholars, human rights experts, and former public officials. Working in panels of five, board members will review selected user appeals from moderators’ decisions. Apart from specific cases, the board will respond to requests for policy guidance from Facebook. These activities are designed to resolve particular disputes and establish principles that will guide the company and moderators in future cases. In a sense, the Oversight Board represents another twist on the outsourcing theme, as Facebook seeks to shift responsibility for certain moderation decisions to an independent body, albeit one that it has created and financed.
From Porn to Violence

Outsourced Facebook moderation takes place at more than 20 sites worldwide, although the company won’t confirm how many countries host these sites. Third-party vendors—companies such as Accenture, Competence Call Center, CPL Resources, Genpact, and Majorel—run these operations. India, Ireland, and the Philippines host major hubs, each of which handles content automatically routed to it from all over the globe. Other Facebook sites, such as those in Kenya and Latvia, focus primarily on content from their respective regions. When they are called upon to review content in languages they don’t know, moderators use the company’s proprietary translation software. Facebook moderators collectively understand more than 50 languages, but the platform supports more than 100—a large gap, which the company says it is working to close. As noted earlier, Facebook declined our repeated requests to visit a moderation site and talk to current workers.

Christopher Gray left a temporary gig teaching English in China to become a Facebook moderator with CPL Resources in 2017 in Dublin, his adopted home city. His wife, who at the time worked for CPL in a non-moderator role, told him about the opening. CPL, which is based in Dublin and has operations across Europe, was expanding its work for Facebook. Gray, who is now 50, started with the relatively modest hourly wage of €12.98, or about $14. CPL also paid a small bonus for working nights and a daily travel allowance. An enthusiastic Facebook user, Gray was impressed by the company’s glass office building in Dublin’s bustling Docklands section. “It has a large atrium and lots of light, artwork, green plants, free food, coffee and tea,” he says in an interview. And he began with a sense of mission. “You have this feeling that you’re there to do good, protect the users,” he says. “We’re making a contribution.”

Facebook says in a company statement that moderators receive “extensive training” that includes “on-boarding, hands-on practice, and ongoing support and training.” Gray describes his training as only eight days of “pretty cursory” PowerPoint displays presented in rote fashion by a CPL staff member. On his first day of actual moderating, Gray went to work on a global pornography “queue.” He sat in front of a desktop monitor deciding whether one image after another violated Facebook’s prohibition of “adult nudity and sexual activity,” as defined in the Community Standards. Employees assigned to other queues assessed content that had been flagged for hate speech, graphic violence, terrorist propaganda, and so forth. The moderators’ options were to click “ignore,” “delete,” or “disturbing,” the last of which applied to items that didn’t violate the standards but might upset some users. Facebook covers disturbing content with a warning that users have to click through if they want to see it.

CPL “team leaders” set daily “game plans” for the number of pieces of content, or “tickets,” each moderator should process. The prescribed “average handling time” was 30 to 60 seconds per ticket, Gray says. That translated to 600 to 800 pieces of content over the course of a typical eight-hour shift, during which he would take time out for short breaks and a meal. When the queue got backed up, he says he would “blast through” 1,000 items in a shift. Facebook says that the company’s third-party contractors don’t enforce quotas and instead encourage reviewers to take as much time as they need to evaluate content.
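As a rough check on that arithmetic, here is a back-of-the-envelope Python sketch. The shift and break lengths are assumptions of ours, since Gray’s account does not specify them; they are not figures from CPL or Facebook.

    # Back-of-the-envelope check of the workload Gray describes.
    # Assumption (ours, not CPL's or Facebook's): an 8-hour shift minus roughly
    # an hour of breaks and a meal leaves about 7 hours of actual review time.
    REVIEW_SECONDS_PER_SHIFT = 7 * 60 * 60  # about 25,200 seconds of screen time

    def tickets_per_shift(avg_handling_seconds: float) -> int:
        """How many pieces of content fit into one shift at a given handling time."""
        return int(REVIEW_SECONDS_PER_SHIFT / avg_handling_seconds)

    for aht in (30, 45, 60):
        print(f"{aht} seconds per ticket -> about {tickets_per_shift(aht)} tickets per shift")
    # -> about 840, 560, and 420 tickets: a range that brackets the 600 to 800 pieces
    # per shift Gray describes. "Blasting through" 1,000 items implies roughly
    # 25 seconds per decision (25,200 / 1,000), leaving little room for hard calls.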
Over the months, Gray’s circumstances changed. He and his working group were moved to a drab CPL-run building in Dublin, and he was switched from porn to a “high priority” queue containing a mixture of deeply troubling material requiring immediate attention. He’d shrugged off the pornography but now was confronted by imagery he couldn’t get out of his head, even during his off hours. He recalls one video in which men in black balaclavas used machine guns to mow down a group of captives in orange jumpsuits at point-blank range. In another, a woman wearing an abaya was stoned to death. In a third, an alleged sex offender in Russia was whipped to death. Animal torture was common, including a video showing dogs being cooked alive in China. He marked all of this content for deletion, but watching it unfold on his screen took a toll.

“One moderator recalls videos showing men in black balaclavas using machine guns to mow down captives in orange jumpsuits, a woman wearing an abaya being stoned to death, and an alleged sex offender in Russia getting whipped to death. Animal torture was common, including a video showing dogs being cooked alive in China.”

Gray’s experience wasn’t unusual. In his first week working for CPL at about the same time, Sean Burke remembers watching a Facebook video of a man being beaten to death with a wooden board with nails sticking out of it. The next day, he encountered a bestiality video for the first time, and soon thereafter he started seeing child pornography. “They never actually teach you to process what you’re seeing,” says Burke, now 31. “It’s not normal seeing people getting their heads cut off or children being raped.” Some of the worst images were uploaded hundreds or even thousands of times, coming back over and over to haunt moderators like Burke and Gray, who often had to remove multiple copies of the same content.

As part of his supervisory duties, Guy Rosen, Facebook’s vice president for integrity, has immersed himself for hours at a time in raw content from moderation queues. Without reference to any specific reviewers, he acknowledges in an interview that being a full-time moderator would be arduous. “It’s a hard job,” he says. “It’s a really hard job.”