Facebook agrees to $52 million content moderator mental health settlement

Facebook agreed Tuesday to a $52 million settlement ending a lawsuit brought by its current and former moderators in San Mateo Superior Court. The payout will provide 11,250 eligible content screeners with resources to receive mental health treatment.

Facebook Moderator Settlement Details

In 2018, Facebook moderator Selena Scola filed suit against the Big Tech firm, arguing that performing content screening duties for its platform caused her to develop post-traumatic stress disorder (PTSD). Moderators with similar allegations from Arizona, California, Florida, and Texas soon joined the complaint.

Last year, The Verge published multiple stories detailing the toll that daily screening of content involving sexual assault, self-harm, and violence against animals took on Facebook’s moderators.

Initially, the world’s largest social network addressed the issue by providing its independent contractors with raises and on-site counseling. With the lawsuit settlement, the Menlo Park, California-based firm agreed to provide enhanced harm mitigation resources.

Every content screener covered by the suit will receive a minimum payment of $1,000. Moderators diagnosed with one mental health condition are entitled to an additional $1,500, and qualified claimants with more than one concurrent psychological illness can receive up to $6,000.

Steve Williams, an attorney for the plaintiffs, issued a statement reflecting his gratitude at the firm’s decision to settle. “We are so pleased that Facebook worked with us to create an unprecedented program to help people performing work that was unimaginable even a few years ago.”

Operational Changes

Besides the $52 million payout, Facebook agreed to supply its moderators with resources designed to make the work of policing its platform less arduous.

The company’s moderators will now have access to one-on-one counseling sessions with certified mental health professionals. The firm’s screeners can also contact a licensed counselor when in crisis and participate in monthly group therapy sessions. Moreover, the corporation’s vendors will be required to assess applicants’ psychological resiliency when hiring and post mental health support information at each workstation.

Facebook is also rolling out software updates that will let screeners mute audio and de-colorize video footage when reviewing content.
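Facebook has not published how these review tools are built. As a rough, hypothetical sketch of the de-colorizing idea, the OpenCV snippet below converts each frame of a clip to grayscale before display; the file name is invented, and because OpenCV’s playback loop carries no audio track, footage reviewed this way is effectively muted as well.

```python
# Illustrative sketch only, not Facebook's implementation: de-colorize a
# video clip for review by converting each frame to grayscale with OpenCV.
import cv2

cap = cv2.VideoCapture("clip_under_review.mp4")  # hypothetical file name
while True:
    ok, frame = cap.read()
    if not ok:  # end of clip or read failure
        break
    # Drop color information; the content stays reviewable but less vivid.
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    cv2.imshow("review (de-colorized)", gray)
    if cv2.waitKey(30) & 0xFF == ord("q"):  # press 'q' to stop playback
        break
cap.release()
cv2.destroyAllWindows()
```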

Increased AI Utilization

The social network also revealed that its artificial intelligence (AI) content filtering tools have become more effective.

At present, Facebook employs 15,000 moderators to keep harmful and misleading content off its network. The firm maintains a large screener workforce because its AI tools lack the contextual understanding to filter user posts properly. However, the organization has recently made meaningful progress in refining its digital review algorithms.

On Tuesday, the technology corporation announced that it had removed 2.5 million posts offering personal protective equipment (PPE) because of price gouging or illegitimate claims.

The firm explained that it re-tasked its firearm and drug image identification tools to deal with scammers exploiting COVID-19. The company had to increase its reliance on AI tools because it had trouble providing its human moderators with adequate remote work setups. As such, Facebook has given its content screening algorithms new capabilities, like multilingual harmful content identification.
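Facebook has not disclosed how the re-tasking was done. One common way to repurpose an existing image classifier is transfer learning: freeze the trained feature extractor and fit a small new classification head for the new category. The PyTorch sketch below illustrates that pattern, with an off-the-shelf ResNet standing in for the firearm/drug detector and dummy tensors standing in for labeled PPE listings; every name in it is an assumption, not Facebook’s code.

```python
# Illustrative only: a generic transfer-learning sketch of "re-tasking" an
# image classifier for a new category (e.g., flagging PPE listings).
import torch
import torch.nn as nn
from torchvision import models

# Start from a network pre-trained on a prior task (ImageNet weights here
# stand in for a firearm/drug detector).
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)

# Freeze the feature extractor; only the new head will be trained.
for param in model.parameters():
    param.requires_grad = False

# Replace the classification head: 2 classes, e.g. "PPE listing" vs. "other".
model.fc = nn.Linear(model.fc.in_features, 2)

optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# One training step on a dummy batch, in place of a real labeled dataset.
images = torch.randn(8, 3, 224, 224)   # 8 fake RGB images
labels = torch.randint(0, 2, (8,))     # fake binary labels
logits = model(images)
loss = loss_fn(logits, labels)
loss.backward()
optimizer.step()
```

The appeal of this approach is that only the small new head needs labeled examples, which is what makes rapidly re-tasking a detector during an emergency plausible.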

The service’s upgrades have produced impressive outcomes recently. Its AI flagged around 90 percent of the content removed since last October. During that time, Facebook pulled down 1.7 billion fraudulent accounts, 7.9 million posts promoting illegal drugs, and 39.5 million instances of adult nudity.

Given how deleterious it is for human beings to moderate the posts of 2.36 billion monthly users, Facebook’s efforts to improve its autonomous detection programs are heartening.
