Grime You Can Never Wash Off: Internet Content Moderation and New Frontiers in Labor Exploitation

Scrolling through e-mails and my Facebook news feed one morning last week, I came across two related articles. The first, from Alternet, was about the disproportionate harassment and abuse that women face online. Citing a recent Atlantic exposé on the issue, as well as death threats made to feminist video game critic and “GamerGate” target Anita Sarkeesian, the article underscored the negligence of Facebook, YouTube, and other companies whose content moderators—those employed to flag and delete offensive materials coming across their sites—appeared indifferent to, or perhaps poorly trained to address, the increasing problem of Internet-based violence against women. These moderators, the article mentions, are often “swamped with cases.” But in a tech industry dominated by men at all levels of employment, whether a woman is subjected to terrifying forms of online abuse—including, in one case, a Facebook post featuring a woman’s head photoshopped onto a picture of a beaten and chained woman—comes down to “human decision-making” on the part of the people tasked with sifting through the digital garbage.

The second article, from Wired, offered a more detailed look at what Internet content moderation involves. I honestly hadn’t given any thought at all to content moderation as an especially filthy job that, even without the smelly trucks and beeping, is a form of garbage collection. In this case, though, the grime sticks to workers in a way that makes emptying trashcans and dumpsters sound like a dream job by comparison.

Internet content moderation is typical of other outsourced, global forms of labor in that the U.S. relies on poorly paid contract workers from the Philippines to do the vast majority of the work. However, since recognizing what would be offensive requires cross-cultural fluency, most companies have also implemented what Wired reporter Adrian Chen calls a “two-tiered moderation system, [where] more complex screening… is done domestically.” Far better paid than overseas workers—“a moderator for a U.S. tech company can make more in an hour than a veteran Filipino moderator makes in a day”—most U.S.-based moderators are culled from the ranks of precariously employed college graduates, many of whom are enticed to take these jobs with suggestions that a more permanent position at Google or Twitter might be on the horizon. In general, however, not only do these better jobs never materialize, but content moderation’s status as labor of the living-nightmare variety quickly becomes apparent to employees.

In The Managed Heart, sociologist Arlie Russell Hochschild begins her discussion of emotional labor, such as the work of flight attendants, care workers, and others in feminized service occupations, by asking whether there may be a fundamental “human cost of becoming an ‘instrument of labor’ at all” (3). This question illuminates the psychological costs faced by those whose jobs require “[inducing] or [suppressing] feeling in order to sustain the outward countenance” that makes consumers of such labor feel properly “cared for.” This “coordination of mind and feeling” can cause the worker to become alienated from an “aspect of self—either the body or the margins of the soul—that is used to do the work” (7).

But what if the work demands subjecting oneself to psychological trauma resulting from the continual repetition of horrifying images and sounds? What happens to the “margins of the soul” when a job requires workers to be used in this way?

Chen interviewed a number of former and current Internet content moderators who describe what they experienced on the job, and what they still carry with them. One U.S.-based moderator quit his job at Google when a co-worker exhibited a nonchalant response to a video of a beheading: “I didn’t want to look back and say I became so blasé to watching people have these really horrible things happen to them that I’m ironic or jokey about it.” Others, subjected to hours of pornography, report feeling desensitized to the point where they “no longer want to be with their spouses” or, on the other hand, leave work with “a supercharged sex drive.” Many companies ostensibly employ counselors to deal with the psychic fallout from this work, which puts laborers at risk of PTSD much like soldiers and members of specialized police forces, though one former worker claimed to not know anyone who had seen a counselor. “But,” Chen emphasizes, “even with the best counseling, staring into the heart of human darkness exacts a toll.” After being made to watch a nearly half-hour video of a woman being raped, “blindfolded, handcuffed, screaming and crying,” one Filipino woman content moderator “began to tremble with sadness and rage” (in Chen’s words). Says the woman, who is still doing content moderation work, “I watched that a long time ago, but it’s like I just watched it yesterday.”

As its own devastating aspect of the “heart of human darkness” run rampant on the Internet, online victimization of women is an urgent problem. Yet after reading Chen’s report, I can’t help but feel that the “human decision-making” involved in content moderation is compromised by the utterly dehumanizing nature of the work. The “aspect of self” that many content moderators become estranged from is their own humanity, unable to plug into and feel things they must figure out a way not to feel in order to simply bear the work.

This is not to say that in the male-dominated tech industry, sexism and misogyny aren’t also at play when moderators make that quick decision to either delete or push through abusive content aimed at women. But read in this context, Hochschild’s work provokes us to think about the ways that gender and psychic health intersect in an occupation that requires exposing oneself to trauma as a primary duty of the job. Counseling isn’t widely advertised or used, and a masculine “deal with it” ethos further contributes to the occupational normalization of violence in an industry that, as Chen puts it, “[relies] on an army of workers employed to soak up the worst of humanity in order to protect the rest of us.”

This last observation raises a version of Hochschild’s initial question: if the job of content moderator requires workers to absorb our collective human trauma in order to “protect the rest of us” from the ravages of the Internet, should a job like this exist at all? Should “must expose oneself to violence repeatedly, for days and weeks on end” be an accepted part of any job description? Chen estimates that content moderators “comprise as much as half of the total workforce for social media sites.” Indeed, moderation work is especially insidious in that, unlike labor more typically associated with trauma—sex work comes to mind—it is hidden within an industry stereotyped as the benign realm of particle-board cubicles and sleepy systems administrators.

When we walk down the street, we see waste management workers laboring to present us with a convincing façade of civilized cleanliness. The more thoughtful among us recognize this as the dangerous lie that it is: this waste is never really “disposed” of, only moved out of sight of the privileged. The existence of content moderation work demands that we consider the human costs of maintaining the web’s garbage-free front. If the Internet requires turning human workers into psychic dumpsters for brutalities the rest of us would rather not have cluttering our Facebook and Instagram feeds, then what kind of virtual world are we living in, grime and all?

Sara Appel

Sara Appel is a Dietrich School Postdoctoral Fellow in the English Department at the University of Pittsburgh.


5 Responses to Grime You Can Never Wash Off: Internet Content Moderation and New Frontiers in Labor Exploitation

  1. JunkChuck says:

    Fascinating–the idea of industrial content moderation never occurred to me, because I’ve never thought to consider the scale of forums requiring moderation. I’m intrigued, and a bit curious in the way that I’ve felt standing at the precipice of a waterfall: what would happen if I just…dipped…a…toe….?


  2. knewman4 says:

    Thanks for your provocative (and depressing!) post. You connect a lot of things I have never seen connected before!


  3. The fact that, per the Wired story, the likes of Facebook outsource a lot of this work to developing nations speaks volumes.


  4. Marissa says:

    It is a sad indictment of capitalism that its greed and refusal to offer dignified and honest-paying work begets the very filth in which it survives. Allowing the degradation of women, minorities, and other groups to go unencumbered permits discrimination, hatred, and ultimately divisiveness to run rampant within a world in such desperate need of love. Why? Simply because it’s less cost-effective to control hatred than to turn a blind eye to its carrying on . . . and that is a sad maxim that floods our economic system.

