Opinion editor's note: This article was written for the New York Times by the co-chairs of Facebook's oversight board. Their names are listed below.
Social media affects people's lives in many ways, good and bad. Right now, as the world endures a health crisis, social media has become a lifeline for many people, providing valuable information and helping families and communities stay connected.
At the same time, we know that social media can spread speech that is hateful, harmful and deceitful. In recent years, the question of what content should stay up or come down on platforms like Facebook, and who should decide this, has become increasingly urgent.
So in November 2018, recognizing that no company should settle these issues alone, Facebook committed to creating an independent oversight body that will review Facebook's decisions about what content to take down or leave up. Over the past 18 months, more than 2,000 experts and other relevant parties from 88 countries have contributed feedback that has shaped the development of this oversight board, which will have 20 members and is scheduled to become operational this year.
The oversight board will focus on the most challenging content issues for Facebook, in areas such as hate speech, harassment and protecting people's safety and privacy. It will make final and binding decisions on whether specific content should be allowed on or removed from Facebook and Instagram (which Facebook owns).
The first members of the oversight board have now been announced. We are the four co-chairs. After Facebook selected us, we considered a large number of candidates for the oversight board, including those recommended by the public, before we interviewed and ultimately approved the 16 other members.
The board members come from different professional, cultural and religious backgrounds and hold a range of political viewpoints. Some of us have been publicly critical of Facebook; some of us haven't. But all of us have training and experience that can help the board consider the most significant content decisions facing online communities. We are all independent of Facebook. And we are all committed to freedom of expression within the framework of international norms of human rights. We will make decisions based on those principles and on the effects on Facebook users and society, without regard to the economic, political or reputational interests of the company.
Our independent judgment is guaranteed by our structure. The oversight board's operations are funded by a $130 million trust fund that is completely independent of Facebook and cannot be revoked. Board members will serve fixed terms of three years, up to a maximum of three terms; they contract directly with the oversight board. We cannot be removed by Facebook. Through the founding bylaws of the oversight board, Facebook has committed to carrying out our decisions even though it may at times disagree, unless doing so would violate the law. Facebook's chief executive, Mark Zuckerberg, has also personally committed to this arrangement.