Facebook sued by former content moderator for psychological trauma, PTSD

12:09 PM September 26, 2018

In this March 29, 2018, file photo, the logo for Facebook appears on screens at the Nasdaq MarketSite in New York’s Times Square. AP/Richard Drew

A former Facebook content moderator has sued the social media company, claiming the harmful nature of her job caused her to suffer from psychological trauma and post-traumatic stress disorder (PTSD).

Selena Scola of San Francisco, California, was employed by Pro Unlimited Inc. and worked as a public content contractor for Facebook from June 2017 to March 2018.

According to the filing in the Superior Court of California for San Mateo County, Scola witnessed “thousands of acts of extreme and graphic violence” from her cubicle in Facebook’s Silicon Valley offices. These included images, videos and live broadcasts of rape, murder, torture and child sexual abuse, among others.

As Facebook users upload millions of images and videos every day, it was Scola’s job to “maintain a sanitized platform” by going through the posts and removing those that violated Facebook’s terms of use. Content moderators were reportedly “asked to review more than 10 million potentially rule-breaking posts per week.”

According to the complaint, Facebook ignored “work place safety standards” despite having drafted such standards to protect content moderators like Scola from workplace trauma.

“Instead, the multi-billion dollar corporation affirmatively requires its content moderators to work under conditions known to cause and exacerbate psychological trauma,” the filing read. “By requiring its content moderators to work in dangerous conditions that cause debilitating physical and psychological harm, Facebook violates California law.”

On behalf of herself and other content moderators like her, Scola sought to “stop these unlawful and unsafe workplace practices” and “to ensure Facebook and Pro Unlimited provide content moderators with proper mandatory onsite and ongoing mental health treatment and support.”

Earlier, in July, Facebook had addressed concerns about who reviews objectionable content on its site. In that statement, it admitted that reviewing such a large amount of content was not easy, as it had never been done before.

However, Facebook wrote at the time that the teams working on safety and security were “doubling in size this year to 20,000. This includes our growing team of 7,500 content reviewers—a mix of full-time employees, contractors and companies we partner with.”

Facebook added that it had a team of four clinical psychologists tasked with creating and delivering resilience programs to the content moderators. Trained professionals are also available onsite for individual and group counseling.

RELATED STORIES: 

Facebook announces stricter policy on firearms sales

In darkest reaches of cyberspace, danger lurks

TOPICS: California, Facebook, Post-traumatic stress disorder (PTSD), rape, San Mateo

© Copyright 1997-2024 INQUIRER.net | All Rights Reserved
