YouTube announced plans Wednesday to remove thousands of videos and channels that advocate for neo-Nazism, white supremacy and other bigoted ideologies in an attempt to clean up extremism and hate speech on its popular service.
The new policy will ban “videos alleging that a group is superior in order to justify discrimination, segregation or exclusion,” the company said in a blog post. The prohibition will also cover videos denying that violent incidents, like the mass shooting at Sandy Hook Elementary School in Connecticut, took place.
YouTube did not name any specific channels or videos that would be banned.
“It’s our responsibility to protect that [openness], and prevent our platform from being used to incite hatred, harassment, discrimination and violence,” the company said in the blog post.
The decision by YouTube, which is owned by Google, is the latest action by a Silicon Valley company to stem the spread of hate speech and disinformation on its site. A month ago, Facebook evicted seven of its most controversial users, including Alex Jones, the conspiracy theorist and founder of InfoWars. Twitter banned Jones last year.
The companies have come under intense criticism for their delayed reaction to the spread of hateful and false content. At the same time, President Donald Trump and others argue that the giant tech platforms censor right-wing opinions, and the new policies put in place by the companies have inflamed those debates.
The tension was evident Tuesday, when YouTube said that a prominent right-wing creator who had used racist and homophobic slurs to harass a journalist in his videos did not violate its policies. The decision set off a firestorm online, including accusations that YouTube was giving a free pass to some of its popular creators.
The decisions illustrated a central theme that has defined the moderation struggles of social media companies: Making rules is often easier than enforcing them.
“This is an important and long-overdue change,” Becca Lewis, a research affiliate at the nonprofit organization Data & Society, said about the new policy. “However, YouTube has often executed its community guidelines unevenly, so it remains to be seen how effective these updates will be.”
YouTube’s scale — more than 500 hours of new videos are uploaded every minute — has made it difficult for the company to track rule violations.