September 25, 2018

Facebook Sued Over "Exposure To Disturbing Images" That Caused Trauma

Facebook employs moderators to determine whether posts violate its rules against violence, hate speech, child exploitation, nudity and disinformation.

A former content moderator says she was exposed to disturbing images at Facebook. (Representational)
San Francisco:

A former Facebook content moderator is suing the company on the grounds that reviewing disturbing material on a daily basis caused her psychological and physical harm, according to a lawsuit filed Monday in a California superior court.

The suit by former moderator Selena Scola, who worked at Facebook from June 2017 until March 2018, alleges that she witnessed thousands of acts of extreme and graphic violence "from her cubicle in Facebook's Silicon Valley offices," where Scola was charged with enforcing Facebook's extensive rules prohibiting certain types of content on its systems.

Scola, who worked at Facebook through a third-party contracting company, developed post-traumatic stress disorder "as a result of constant and unmitigated exposure to highly toxic and extremely disturbing images at the workplace," the suit says.

Facebook didn't respond to a request for comment.

Facebook relies on thousands of moderators to determine if posts violate its rules against violence, hate speech, child exploitation, nudity and disinformation. Many objectionable categories come with their own sublists of exceptions. The company is staffing up its global workforce - hiring 20,000 content moderators and other safety specialists in places such as Dublin, Ireland; Austin, Texas; and the Philippines - in response to allegations that it has not done enough to combat abuse of its services, including Russian meddling, illegal drug content and fake news.

The social network says that in recent years it has been developing artificial intelligence to spot problematic posts, but the technology isn't sophisticated enough to replace the need for significant amounts of human labor.

Facebook is under intense scrutiny from lawmakers and regulators, who have taken top executives to task in two high-profile hearings on Capitol Hill this year and are considering new regulations that would hold companies like Facebook to a more stringent standard of responsibility for illegal content posted on their platforms.

The complaint also charges the Boca Raton, Florida-based contracting company, Pro Unlimited, Inc., with violating California workplace safety standards.

Pro Unlimited didn't respond to a request for comment.

The lawsuit does not go into further detail about Scola's particular experience because she signed a non-disclosure agreement that limits what employees can say about their time on the job. Such agreements are standard in the tech industry, and Scola fears retaliation if she violates it, the suit says. Her attorneys plan to dispute the NDA but are holding off on providing further detail until a judge weighs in.

The suit notes that Facebook is one of the leading companies in an industry-wide consortium that has developed workplace safety standards for the moderation field. The complaint alleges that, unlike its industry peers, Facebook does not uphold the standards it helped develop.

In 2017, two former content moderators also sued Microsoft, claiming that they developed PTSD and that the company did not provide adequate psychological support.

The suit asks that Facebook and its third-party outsourcing companies provide content moderators with proper, mandatory onsite and ongoing mental health treatment and support, and establish a medical monitoring fund for testing and providing mental health treatment to former and current moderators.

