Opinion editor’s note: This article was written for the New York Times by the co-chairs of Facebook’s oversight board. Their names are listed below.
Social media affects people’s lives in many ways, good and bad. Right now, as the world endures a health crisis, social media has become a lifeline for many people, providing valuable information and helping families and communities stay connected.
At the same time, we know that social media can spread speech that is hateful, harmful and deceitful. In recent years, the question of what content should stay up or come down on platforms like Facebook, and who should decide this, has become increasingly urgent.
So in November 2018, recognizing that no company should settle these issues alone, Facebook committed to creating an independent oversight body that will review Facebook’s decisions about what content to take down or leave up. Over the past 18 months, more than 2,000 experts and other relevant parties from 88 countries have contributed feedback that has shaped the development of this oversight board, which will have 20 members and is scheduled to become operational this year.
The oversight board will focus on the most challenging content issues for Facebook, including in areas such as hate speech, harassment and protecting people’s safety and privacy. It will make final and binding decisions on whether specific content should be allowed on or removed from Facebook and Instagram (which Facebook owns).
The first members of the oversight board are now being announced. We are the four co-chairs. After Facebook selected us, we considered a large number of individuals for the oversight board, including those recommended by the public, before we interviewed and ultimately approved the 16 other members.
The board members come from different professional, cultural and religious backgrounds and have various political viewpoints. Some of us have been publicly critical of Facebook; some of us haven’t. But all of us have training and experience that can help the board in considering the most significant content decisions facing online communities. We are all independent of Facebook. And we are all committed to freedom of expression within the framework of international norms of human rights. We will make decisions based on those principles and on the effects on Facebook users and society, without regard to the economic, political or reputational interests of the company.
Our independent judgment is guaranteed by our structure. The oversight board’s operations are funded by a $130 million trust fund that is completely independent of Facebook and cannot be revoked. Board members will serve fixed terms of three years, up to a maximum of three terms; they contract directly with the oversight board. We cannot be removed by Facebook. Through the founding bylaws of the oversight board, Facebook has committed to carrying out our decisions even though it may at times disagree, unless doing so would violate the law. Facebook’s chief executive, Mark Zuckerberg, has also personally committed to this arrangement.
The entire process is designed with transparency in mind. All of the oversight board’s decisions and recommendations will be made public, and Facebook must respond publicly to them.
We have also worked to create a system that is accessible to people. Users will be able to appeal to the oversight board if they disagree with Facebook’s initial decision about whether to take down or leave up a given piece of content, and Facebook can also refer cases to the board. (In the initial phase users will be able to appeal to the board only in cases where Facebook has removed their content, but over the next months we will add the opportunity to review appeals from users who want Facebook to remove content.)
We will not be able to offer a ruling on every one of the many thousands of cases that we expect to be shared with us each year. We will focus on identifying cases that have a real-world impact, are important for public discourse and raise questions about current Facebook policies. Cases that examine the line between satire and hate speech, the spread of graphic content after tragic events, and whether manipulated content posted by public figures should be treated differently from other content are just some of those that may come before the board.
Over the coming months, we will lay out how we prioritize and select cases for review. Once a case has been chosen, it will be considered by a panel with a rotating set of members. All panel decisions will be reviewed by the entire board before they are finalized, and if a majority of members disagree with a decision, they can ask a new panel to hear the case again.
The oversight board cannot, of course, address every concern that people may have with Facebook. Policymakers, regulators and those who have a stake in the effects of technology on our society all continue to have a critical role to play. We also know that we will not be able to please everyone. Some of our decisions may prove controversial and all will spur further debate.
But we speak for all the members of the oversight board when we say that we are committed to demonstrating the value of an independent, principled and transparent oversight process and to serving the online community.
The authors are: Catalina Botero-Marino, a former special rapporteur on freedom of expression of the Organization of American States; Jamal Greene, a law professor at Columbia University; Michael W. McConnell, a law professor at Stanford University; and Helle Thorning-Schmidt, a former prime minister of Denmark.