Does YouTube Really Take Down Toxic Videos on the Platform?

"YouTube Executives Ignored Warnings, Letting Toxic Videos Run Rampant," reports Bloomberg.

It's no secret that YouTube began making changes in late 2016 and started demonetizing channels that promoted toxic content in 2017. Yet at the end of 2017, only twenty people were employed on its "trust and safety" team. For more details, you can read Bloomberg's report on how staffers fought to keep controversial videos from going viral.

According to more than 20 former and current YouTube employees, staff offered suggestions for limiting the distribution of videos containing disturbing, extremist content or conspiracy theories, but management was reportedly more interested in increasing engagement than in heeding the warnings.

Yonatan Zunger, a privacy engineer at Google who left the company in 2016, proposed that videos allowed to stay on YouTube but "close to the line" of the takedown policy be removed from recommendations. His proposal was turned down. And he was not the only one to leave: more than five senior employees have also departed over the company's unwillingness to address the problem.

Another former employee said that YouTube CEO Susan Wojcicki "would never keep her finger on the pulse," and that her goal was simply to "manage the company" rather than fight the onslaught of misinformation and harmful content.

Last year, YouTube tried to curb the spread of false news and conspiracies on its platform with information panels, and this year it began pulling advertisements from potentially toxic content. Nevertheless, YouTube still needs to fix the underlying problem of content moderation, since toxic content remains widely distributed on the site. Merely limiting the distribution of controversial videos is not enough.