New report casts light on the dark job of Facebook content moderators


A new report from The Verge has cast light on the dark, demanding job of those who worked for a company contracted to perform content moderation for Facebook.

Because these moderators had access to the personal information of Facebook’s 2.3 billion monthly users, and because of fears of retaliation from disgruntled users banned from the platform, they signed non-disclosure agreements (NDAs) forbidding them from speaking about their jobs. However, when a dozen anonymous sources pulled back the veil of secrecy, The Verge exposed a closely monitored workplace that left multiple former employees with deep psychological scars.

Moderate Pay for Moderation

The Verge’s report focuses on Cognizant, a professional services vendor, and its content moderation site in Phoenix, Arizona. According to the report, Cognizant moderators make an annual salary of just $28,800, a pittance compared to the $240,000 in annual compensation earned by the average Facebook employee.


In contrast to the lavish workspaces and freedom afforded to Facebook’s own employees, these moderators worked in a drab office park and had their workdays managed all the way down to bathroom breaks. According to The Verge, “contractors in Arizona labor in an often cramped space where long lines for the few available bathroom stalls can take up most of employees’ limited break time.” To protect user privacy, no items, not even scraps of paper, were allowed at their workspaces unless they were placed in a clear bag visible to managers.

Witnessing the Worst

Whatever the working conditions, there’s an inescapable ugliness to the job of reviewing content reported for violating Facebook’s community standards. Although a Cognizant policy manager told The Verge that “most of the stuff we see is mild, very mild,” there will always be exceptions, some of them extreme.

One former employee suffered a panic attack after a graphic video of a man being stabbed to death was shown to a room full of trainees. Explicit sex, including bestiality, regularly appeared in the queue. Racism was even more common.

Conspiracy theories also came up for review, and exposure to them appears to have influenced some employees’ beliefs. One auditor “walks the floor promoting the idea that the Earth is flat,” and another former employee told The Verge, “I no longer believe 9/11 was a terrorist attack.”

Contract High

As one might imagine of a job “that was darkening our soul[s],” as one former employee put it, moderators turned to a variety of coping mechanisms. Beyond ever-present gallows humor, one former Cognizant employee told The Verge he used a marijuana vaporizer at work “almost daily” and that many employees would use pot and alcohol both on and off campus.

“I can’t even tell you how many people I’ve smoked with,” he told The Verge.

Because the NDAs made it impossible to discuss their jobs with anyone outside of work, “You get really close to your coworkers really quickly,” one former employee told The Verge. Unsurprisingly, employees would chase relief in sex. According to the report, locations where coworkers were found fornicating included “the bathroom stalls, the stairwells, the parking garage, and the room reserved for lactating mothers.”

Audit Pressure

The final toll paid by moderators is the stress that comes with high job insecurity. The report claims that employees could be fired for “making just a handful of errors a week.” That’s due to Cognizant’s pressure for “accuracy” above all else. Accuracy is measured by how often Facebook’s internal moderators agree with a sample audit of the contractor’s decisions; although Cognizant’s target is 95 percent accuracy, “it usually floats in the high 80s or low 90s,” according to The Verge.

The challenges faced by moderators, which have been covered by Motherboard and others, include volume, contextual ambiguity, cultural considerations, and the constant evolution and refinement of Facebook’s community standards. At a place like Cognizant, moderators’ choices are less about applied lessons in the nuances of digital free speech than about chasing a standard of accuracy that’s always out of reach.

The internal pressure to hit accuracy goals sometimes led to conflict between moderators and managers who disagreed with their decisions. Amid this pervasively morbid atmosphere, one quality assurance supervisor began bringing a concealed gun to work in case one of the many embittered former employees followed through on their threats.

Moderator Temperament

Recognizing the heavy mental toll of the job, Cognizant keeps two counselors on site to help employees deal with triggering or traumatic posts. The company also incorporates activities like yoga and meditation into the workday to help moderators manage their stress.

Nevertheless, serving on the front line of social media defense demands a certain personality type, just as specific traits are required of EMTs, police officers, and morticians. Until we can train AI to recognize speech and morality with the nuance it can apply to, say, the patterns of bee waste, we’ll need human content moderators, especially now that Snopes is no longer fact-checking fake news on Facebook.

If nothing else, The Verge report should serve as a warning for those interviewing for a position in content moderation. Here’s another, courtesy of the German philosopher Friedrich Nietzsche: “Whoever fights monsters should see to it that in the process he does not become a monster. [For] when you look long into an abyss, the abyss also looks into you.”