YouTube is going to need humans to review sensitive content

Notre Dame fire screw-up shows that algorithms make mistakes, too.

The Washington Post
April 18, 2019 at 11:00 p.m.

A structure billows with smoke at its tallest point and then crumbles. Any human this week would have recognized this image of destruction as the burning of Notre Dame Cathedral in Paris. A machine at YouTube, however, did not: It saw 9/11 instead.

YouTube's tool for appending accurate information to sensitive content failed Monday when Notre Dame's spire fell from the sky. Instead of providing details about the event alongside three news accounts' live streams, or simply leaving those videos unannotated, YouTube's algorithm added a box explaining the attack on the twin towers more than 17 years ago.

The mishap was at once ironic and instructive. YouTube built its "information panels" to fight misinformation, in part in response to a conspiracy theory, which ended up trending on its platform, accusing a survivor of the Parkland, Fla., high school shooting in February 2018 of being a "crisis actor." Yet by linking what appears to have been an awful accident to terrorism, the panels promoted hoaxes and confusion across social media sites. There's a lesson there: As platforms finally start to take responsibility for their role in curating what appears on their turf, they must recognize that real responsibility means that for the foreseeable future, humans — not only machines — will have to do much of the work.

Though YouTube pledged in late 2017 to have more than 10,000 moderators, and Facebook reportedly has about 15,000, some continue to insist that algorithms are the eventual answer to the scourge of illegal and otherwise dangerous content. YouTube's decision to base its information panel project on the judgments of computers was a case in point. The company was trying, rightly, to correct for users' fondness for hoaxes — but instead of involving humans in the process, it gave computers the job.

Certainly, human review cannot possibly be applied to every addition to a worldwide open platform the way editors watch over traditional media. That is the price we pay for access to such far-reaching stores of information. But as effective and efficient as machines may be at enforcing basic rules, and as essential as they are for triaging inundations of posts and uploads, there are some things they may never do "better" than we can with our own flawed minds. That's especially true in areas where context is key, such as hate speech, conspiracy theories and, yes, breaking news events.

FROM AN EDITORIAL IN THE WASHINGTON POST
