HALF MOON BAY, Calif. — In Myanmar, posts on Facebook have spread hate across the country, and critics say the response by the company has been inadequate.
“Before Facebook introduces a gun, they need to educate people, so they can learn how this gun can be used safely,” said Aung Kyaw Moe, director of the Center for Social Integrity in Myanmar.
He said the company needs to do more to promote digital and media literacy in a country where Facebook is dominant and, for many people, constitutes the entire online experience, leaving them with little context to separate fact from fiction.
While the stated mission of Facebook is to make the world more open and connected, Myanmar is one of a growing number of examples of the dark side of the platform, as the service is exploited across many of the 133 languages in which it operates.
Facebook commissioned a report from Business for Social Responsibility (BSR) that outlines how the platform became a tool for government propaganda in the genocidal violence against the Rohingya minority in Myanmar and what the company should do about it. In the past, Facebook has not taken responsibility for the content its users post. But critics tell Devex the Myanmar example highlights the need to go beyond technical fixes to stop the spread of propaganda, fake news, and hate on its platform.
Changing the model
The report from BSR builds on criticisms of Facebook by the United Nations, which accused the company of being slow to act, after it became clear the military in Myanmar used the platform to accelerate the ethnic cleansing of the Rohingya.
“Before entering any new market, particularly those with volatile ethnic, religious or other social tensions, Facebook and other social media platforms, including messenger systems, should conduct in-depth human rights impact assessments for their products, policies, and operations, based on the national context and take mitigating measures to reduce risks as much as possible,” reads a report from the United Nations Human Rights Council.
That report outlines the risks of Facebook entering any region without an understanding of the local context and plans for moderating content, but others argue change must come from headquarters, where they say leaders are in over their heads.
Referring to the 43,000 Rohingya missing and presumed dead, Roger McNamee, an early investor in Facebook, said the company needs to take responsibility for the role it played in Myanmar and cannot recover unless it changes its business model, which he said is built on surveillance and is dangerous for society.
“This is on their head, and yet they culturally believe in what they’re doing so intensely, that they have not been able to accept that they are responsible for things like that,” he said.
‘We can and should do more’
Facebook said it agrees with the findings of the report from BSR assessing its human rights impact in Myanmar.
“The report concludes that, prior to this year, we weren’t doing enough to help prevent our platform from being used to foment division and incite offline violence,” Alex Warofka, product policy manager at Facebook, said in a statement. “We agree that we can and should do more.”
Warofka outlined steps the company is taking in each of the five areas BSR recommends: adopting a human rights policy, establishing governance structures to oversee it, and providing updates on progress; improving enforcement of its community standards; playing a greater role in advocacy for reform in Myanmar; sharing data that would be useful in tracking international human rights violations; and mitigating future risks, particularly in the lead-up to the country’s 2020 elections.
“We're seeing progress,” Warofka told Devex via email. “In the third quarter of 2018, we saw our numbers improve: We took action on approximately 64,000 pieces of content in Myanmar for violating our hate speech policies.” He added that Facebook proactively identified 63 percent of that content, up from 13 percent in the last quarter of 2017 and 52 percent in the second quarter of 2018.
Recently, the company has taken steps to educate users about its policies, even posting them in cartoon form for people in Myanmar who cannot read.
Facebook has removed the accounts of a number of Myanmar military officials, hired 99 native Myanmar language speakers to review content, and is now working to support the transition from Zawgyi, the font and character encoding widely used to display Burmese, to the international text encoding standard Unicode, which will make violations easier to track.
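The Zawgyi problem is concrete: Zawgyi reuses Myanmar-block code points in a different order and with different meanings than Unicode prescribes, so the same visible word can be stored as two different strings, and a filter trained on one encoding misses the other. Below is a minimal Python sketch, illustrative only, of one well-known difference, the ordering of the vowel sign E; real detection and conversion rely on dedicated tools such as ICU's Zawgyi-to-Unicode transliterator.

```python
# Illustrative sketch: why Zawgyi-encoded Burmese defeats naive matching.
# In Zawgyi, the vowel sign E (U+1031) is stored BEFORE its consonant;
# standard Unicode stores it AFTER the consonant.
NA = "\u1014"        # MYANMAR LETTER NA
VOWEL_E = "\u1031"   # MYANMAR VOWEL SIGN E

unicode_ne = NA + VOWEL_E   # Unicode logical order: consonant, then vowel
zawgyi_ne = VOWEL_E + NA    # Zawgyi visual order: vowel, then consonant

# Each looks correct to users of its own font, but the underlying code
# points differ, so a moderation filter matching the Unicode spelling
# does not find the Zawgyi-typed spelling:
print(unicode_ne == zawgyi_ne)   # False
print(unicode_ne in zawgyi_ne)   # False
```

This is why the migration to Unicode matters for enforcement: once content is stored in a single canonical encoding, the same word has the same code points everywhere, and keyword matching and classifiers behave consistently.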
The BSR report recommends that Facebook establish a separate policy defining its approach to content moderation with respect to human rights. The company is looking into it, Warofka said. Meanwhile, it is hiring more human rights specialists and working more closely with NGOs, academia, and international organizations.
Michael Lwin, the co-founder of Yangon-based social enterprise Koe Koe Tech, was consulted for the BSR report and talks frequently with Facebook staff about the situation in Myanmar.
“Facebook executes the default playbook when things go awry: damage control,” he said. “I think it would be much better if big tech companies owned up to their gaps when things go awry, and then reached out to and hired experts in the relevant areas.”
He suggests the company hire cultural anthropologists, as well as academics and lawyers specializing in online speech, plus a high-level executive versed in human rights who would have veto power over CEO Mark Zuckerberg and COO Sheryl Sandberg on issues of fake news and hate speech.
Some of the stakeholders BSR interviewed for its report suggested that Facebook locate staff in Myanmar, rather than nearby countries such as Singapore, but Warofka told Devex Facebook has no plans to set up an office in the country.
“As the BSR report notes, there would be real risks involved in doing so, including the potential for increased government leverage on content and data requests as well as potential risks to our employees,” he wrote. “However, we continue to invest in programs, including regular training for civil society and community groups on using our tools, and in technology that we believe will improve people's experience on Facebook in Myanmar.”
Moe of the Center for Social Integrity in Myanmar said Facebook should have staff in-country: the company is doing business there and needs to mitigate the risks that come with that.
As the company works to bring more people who have never been online onto its platform, the question becomes how lessons from Myanmar might be applied in other contexts.
McNamee said Facebook appears to be working through a list of things that went wrong instead of anticipating issues that might arise in the future, such as how algorithms that favor attention-grabbing posts handle content fueling fear and anger.
Until Facebook changes its business model, the best it can do is “play whack-a-mole” against problems that may seem isolated but are really systemic, he said.