
Facebook settles moderator suit for $52M as hate speech on site increases


unsettling —

Good news: the robots are catching more. Bad news: there’s more to catch.


Content moderators work at a Facebook office in Austin, Texas.

Many jobs can cause employee burnout, but confronting the absolute worst cruelty humanity has to offer for 40 hours a week can go well beyond burnout and leave workers with serious mental-health trauma. Facebook has now settled with a group of content moderators who sued the tech behemoth, alleging their jobs left them with severe post-traumatic stress disorder that the company did nothing to mitigate or prevent.

The company will pay $52 million to settle the suit, first filed in 2018 by a content moderator named Selena Scola. Scola’s suit alleged that she developed “debilitating” PTSD after having to watch “thousands of acts of extreme and graphic violence.”

The conditions under which Facebook moderators often work have been extensively reported by The Guardian, The Verge (more than once), The Washington Post, and BuzzFeed News, among others. Moderators, who mostly work for third-party contract firms, described to reporters hours spent looking at graphic murders, animal cruelty, sexual abuse, child abuse, and other horrifying footage, all while receiving little to no managerial or mental-health support and facing hard-to-meet quotas under shifting guidelines.

“We are so pleased that Facebook worked with us to create an unprecedented program to help people performing work that was unimaginable even a few years ago,” said Steve Williams, an attorney representing the plaintiffs, in a written statement. “The harm that can be suffered from this work is real and severe.”

More than 11,000 current and former content moderators working in Arizona, California, Florida, and Texas will receive at least $1,000 from the settlement. Employees formally diagnosed with mental health conditions such as PTSD or depression may receive an additional $1,500 per diagnosis to pay for treatments, up to $6,000 total per employee. Those with qualifying diagnoses may also be able to submit evidence of other injuries they received as a result of their work for Facebook and receive additional compensation in damages.

As with any class-action suit, the amount any individual can receive may be significantly reduced if the majority of the class applies and qualifies for benefits.

Facebook in its statement said it was “grateful to the people who do this important work,” adding, “we’re committed to providing them additional support through this settlement and in the future.”

Automating the solution

Facebook’s ultimate goal has been to automate as much content moderation as possible. That way, fewer contractors will have to be paid to screen damaging content, and those who still do the work won’t have to watch quite as much of it.

The company’s most recent community-standards enforcement report, released yesterday, indicates that the automated tools are indeed getting better, even if they still have a long way to go.

About 90 percent of the hate speech Facebook removed in the last quarter was detected automatically before anyone reviewed it, the company said. In 2018, that figure was about 24 percent, which was itself "up from roughly zero percent the year before that," company CEO Mark Zuckerberg said in a call with media. All told, the company deleted about 9.6 million pieces of hate speech in the first quarter, up from about 5.7 million in the previous report period, and removed about 4.7 million posts from or related to hate groups specifically.

Facebook, like every other platform out there right now, is also grappling with COVID-19 misinformation, which proliferates quickly when you have more than 2.6 billion users. In April, Facebook's fact-checkers "put about 50 million labels on pieces of content related to COVID-19 based on 7,500 articles," Zuckerberg said. Those labels appear to work: about 95 percent of the time, users who see a warning label do not click through to the flagged content.
