Facebook has long promoted itself as a tool for bringing people together to make the world a better place. Now the social media giant has acknowledged that in Myanmar it did the opposite, and human rights groups say it has a lot of work to do to fix that.
Facebook failed to prevent its platform from being used to “foment division and incite offline violence” in the country, an executive said Monday, citing a human rights report the company commissioned.
“We agree that we can and should do more,” wrote Alex Warofka, a Facebook product policy manager. He also said Facebook would invest resources in addressing the abuse of its platform in Myanmar that the report outlines.
The report, by BSR, or Business for Social Responsibility, paints a picture of a company that was unaware of its potential for doing harm and did little to figure out the facts on the ground. The report details how Facebook unwittingly entered a country new to the digital era and emerging from decades of censorship, all the while plagued by political and social divisions.
But the report fails to look closely at how Facebook employees missed a crescendo of posts and misinformation that helped to fuel violence in Myanmar, particularly against minority Rohingya Muslims. The report recommended that Facebook increase enforcement of policies for content posted on its platform, exercise greater transparency about its progress and engage with civil society and officials in Myanmar.
Some Facebook detractors criticized the company Tuesday for releasing the report on the eve of the U.S. midterm elections. Facebook said it had previously committed to publishing the report at this time.
The company still faces scrutiny from lawmakers who say it is not doing enough. In some countries, Facebook’s experiments have helped to amplify fake stories, while its slow response in other countries, including Sri Lanka and the Philippines, has allowed rumors to spark violence or hate speech to spread.
Human rights groups said Facebook’s pledge needed to be followed up with action. Phil Robertson, deputy Asia director for Human Rights Watch, said Facebook’s actions in Myanmar would be “the acid test” to see if it becomes “a responsible platform manager with its own enforceable code of conduct.”