YouTube's human content moderators are back at work

Ruben Fields
September 24, 2020

In an attempt both to make YouTube a safer platform for younger audiences and to appease its advertisers' concerns about the videos their ads run against, YouTube has been age-restricting videos it deems to contain adult content.

The complaint says the plaintiff had to watch videos of people eating from a smashed-open skull, of a kidnapped woman being beheaded by a cartel, of a fox being skinned alive, of school shootings showing dead children, of a backyard abortion, and of several suicides, including a politician shooting himself and a man falling to his death from a roof with graphic audio.

The former moderator, who isn't named, is seeking medical treatment, compensation for the trauma she suffered and the creation of a YouTube-funded medical monitoring program that would screen, diagnose and treat content moderators.

"Going forward, we will build on our approach of using machine learning to detect content for review, by developing and adapting our technology to help us automatically apply age-restrictions", a recent YouTube blog post confirmed. Most of the videos also did not break any rules of the platform. However, the AI moderators will still filter the videos on hate speech. In May, Facebook agreed to pay $52 million to content moderators as part of a settlement. Also alleged is that YouTube "failed to adequately inform potential content moderators about the negative impact the job could have on their mental health and what it involved", CNET said.

YouTube asserted that its reliance on AI moderation had led to a surge in video removals and incorrect takedowns, the Financial Times reported. However, according to the plaintiff, doing so would put an employee's job at risk. The company allows workers to speak with wellness coaches, but the coaches have no medical expertise and aren't available to moderators who work at night. One coach reportedly told a content moderator to just "trust in God".

A former YouTube content moderator is suing the company, claiming she developed depression and PTSD symptoms from inadequate precautions and overexposure to violent content. "As a result, and on a temporary basis, we are using more technology to carry out some of the tasks that flesh and blood reviewers normally do, so we are removing more content that may not violate our policies", YouTube said. Users might be asked to provide proof of their age to access content for over-18s, or else might be restricted from viewing any content that requires verification.
