The social media giant Facebook and one of its contractors are being sued by a former content moderator who claims she suffered post-traumatic stress from viewing child porn, murder, and other graphic violence all day on the job.

Former moderator Selena Scola, who worked for Facebook via a contractor at its Silicon Valley headquarters for nine months, claims in her lawsuit, filed on Friday in California, that “Every day, Facebook users post millions of videos, images, and livestreamed broadcasts of child sexual abuse, rape, torture, bestiality, beheadings, suicide, and murder."

“To maintain a sanitized platform, maximize its already vast profits, and cultivate its public image, Facebook relies on people like Ms. Scola — known as ‘content moderators’ — to view those posts and remove any that violate the corporation’s terms of use," the suit added.


“Facebook is ignoring its duty to provide a safe workplace and instead creating a revolving door of contractors who are irreparably traumatized by what they witnessed on the job,” said Scola's attorney Korey Nelson in a statement.

The Los Angeles Times reports:

Content moderators tasked with removing posts that violate Facebook’s terms of use watch videos and livestreams of “child sexual abuse, rape, torture, bestiality, beheadings, suicide and murder,” according to the complaint.

“From her cubicle in Facebook’s Silicon Valley offices, Ms. Scola witnessed thousands of acts of extreme and graphic violence,” the suit alleges.

Facebook’s chief executive, Mark Zuckerberg, acknowledged last year that some people were using the platform to broadcast self-harm.

“Just last week, we got a report that someone on [Facebook] Live was considering suicide,” he wrote. “We immediately reached out to law enforcement, and they were able to prevent him from hurting himself. In other cases, we weren’t so fortunate.”

Facebook and other internet service providers voluntarily established industry standards for training, counseling and supporting content moderators more than a decade ago, attorneys said. The lawsuit claims Facebook does not follow the workplace safety guidelines it helped create.

The suit asks that both Facebook and Pro Unlimited, the contractor that employed Scola, fund a medical monitoring program that would help diagnose and treat her and other content moderators for psychological injuries, including PTSD.

The New York Post adds:

Scola and her colleagues were required to view more than 10 million potentially objectionable posts every week, but the companies failed to protect them from the “psychological trauma” of doing so, the suit alleges.

Industry standards that Facebook helped write include limiting moderators’ exposure to child porn — or allowing them to opt out of seeing it altogether — and providing mandatory counseling, but Scola claims the company failed to put these policies into practice.

Bertie Thomson, director of corporate communications at Facebook, issued a statement to The Register.

“We are currently reviewing this claim. We recognize that this work can often be difficult. That is why we take the support of our content moderators incredibly seriously, starting with their training, the benefits they receive, and ensuring that every person reviewing Facebook content is offered psychological support and wellness resources."

"Facebook employees receive these in house and we also require companies that we partner with for content review to provide resources and psychological support, including onsite counseling (available at the location where the plaintiff worked) and other wellness resources like relaxation areas at many of our larger facilities,” the statement concluded.

So, while Facebook has gone from being a platform for posting content to being a content supervisor that seeks to sanitize news and other commentary to its liking, it now stands accused of failing to care for its own people by subjecting them to these images and videos.

While I'm not condoning such images or videos, it is interesting to note that although there are literally tens of thousands of these moderators censoring far more than just graphic, and in some cases illegal, posts, no one can seem to find time to respond to those of us who get banned or cited for violating community standards over simple statements of fact.

While Scola's lawsuit seems somewhat frivolous, considering that she apparently understood the kind of content she would be reviewing, I have to ask a more pressing question: Is Facebook reporting criminal posts to authorities, especially those involving child abuse, child porn, and other pictures and videos of children? If not, isn't it complicit in protecting those with illegal images and videos?

Article posted with permission from Sons Of Liberty Media
