Jakarta, Indonesia – Years after coming under scrutiny for contributing to ethnic and religious violence in Myanmar, Facebook still has problems detecting and moderating hate speech and misinformation on its platform in the Southeast Asian country, internal documents viewed by The Associated Press show.
Three years ago, the company commissioned a report that found Facebook was being used to “stir up divisions and incite offline violence” in the country. The company promised to do better and has made some progress.
But violations persist, and have even been exploited by hostile actors, since the Feb. 1 military takeover this year.
Today, scrolling through Facebook, it is not difficult to find posts threatening murder and rape in Myanmar. On October 24, a 2 1/2 minute video calling for violence against opposition groups, posted by an army supporter, received more than 56,000 views.
“From now on, we are all gods of death for (them),” the man says in Burmese, looking at the camera. “Come tomorrow and see if you are a real man or gay.”
One account posted a military defender’s home address and a photo of his wife. Another post, from October 29, includes a photo of bound and blindfolded men being led down a dirt road by soldiers. The Burmese caption reads, “Don’t catch them alive.”
Despite the ongoing problems, Facebook viewed its operations in Myanmar as both a model and a cautionary case for the rest of the world. Documents reviewed by the AP show that Myanmar became a testing ground for new content moderation technology, with the company trialing automated methods of detecting hate speech and misinformation, with varying degrees of success.
Facebook’s internal conversations about Myanmar were revealed in disclosures made to the Securities and Exchange Commission and provided to Congress in redacted form by the legal counsel of former Facebook employee turned whistleblower Frances Haugen. The redacted versions received by Congress were obtained by a consortium of news outlets, including The Associated Press.
Facebook’s history in Myanmar is shorter but more volatile than in most countries. After decades of censorship under military rule, Myanmar came online in 2000. Shortly afterward, Facebook teamed up with telecom providers in the country to let customers use the platform without having to pay for data, which was still expensive at the time. Use of the platform exploded. For many people in Myanmar, Facebook became the internet itself.
Htaike Htaike Aung, a Myanmar internet policy advocate, said the platform also became a “stronghold of extremism” around 2013, when religious riots broke out between Buddhists and Muslims across Myanmar.
Htaike Htaike Aung said she met with Facebook that same year and described the problems in the country, including how local organizations were seeing an unusual amount of hate speech on the platform and how preventive mechanisms, such as reporting posts, did not work in the Myanmar context.
An example of this was a picture of a pile of bamboo sticks with the caption, “Let’s be prepared because there is going to be a riot inside the Muslim community.”
Htaike Htaike Aung said the photo was reported to Facebook, but the company did not take it down, saying it did not violate any of the company’s community standards.
“Which is ridiculous because it was actually calling for violence. But Facebook didn’t see it that way,” she said.
Years later, the lack of moderation caught the attention of the international community. In March 2018, UN human rights experts investigating attacks on Myanmar’s Muslim Rohingya minority said Facebook had played a role in spreading hate speech.
When asked about Myanmar during a US Senate hearing a month later, CEO Mark Zuckerberg replied that Facebook planned to hire “dozens” of Burmese speakers to moderate content, would work with civil society groups to identify hate figures, and would develop new technologies to combat hate speech.
“Hate speech is very language-specific. It’s hard to do without native speakers, and we need to dramatically step up our efforts there,” Zuckerberg said.
Facebook’s internal documents show that while the company stepped up its efforts to combat hate speech, the tools and strategies to do so never fully worked, and people inside the company repeatedly raised the alarm. In a May 2020 document, an employee stated that hate speech text classifiers that were available were not being used or maintained. Another document a month later said there was a “significant gap” in the detection of misinformation in Myanmar.
Ronan Lee, a visiting scholar at the International State Crime Initiative at Queen Mary University of London, said: “Facebook took symbolic actions I think were designed to mollify policymakers that something was being done and that there was no need to look much deeper.”
In a statement emailed to the AP, Rafael Frankel, Facebook’s director of policy for APAC emerging countries, said the platform had “formed a dedicated team of more than 100 Burmese speakers,” but declined to say how many of them were employed as content moderators. The online marketing company NapoleonCat estimates that there are approximately 28.7 million Facebook users in Myanmar.
During her testimony to the European Parliament on November 8, whistleblower Frances Haugen criticized Facebook for its lack of investment in third-party fact-checking and its reliance on automated systems to detect harmful content.
“If you focus on these automated systems, they will not work in the most ethnically diverse places in the world, the most linguistically diverse places in the world, which are often the most fragile,” she said, referring to Myanmar.
Following Zuckerberg’s 2018 congressional testimony, Facebook developed digital tools to combat hate speech and misinformation, and created a new internal framework for handling crises like Myanmar around the world.
Facebook compiled a list of “at-risk countries,” with ranked tiers for a “critical countries team” to focus its energy on, as well as a list of languages needing more content moderation. Myanmar was listed as a “Tier 1” at-risk country, with Burmese considered a “priority language” alongside Ethiopian languages, Bengali, Arabic and Urdu.
Facebook engineers taught their automated systems the Burmese words for “Muslims” and “Rohingya.” They also trained the systems to detect “coordinated inauthentic behavior,” such as a single person posting from multiple accounts, or different accounts coordinating to post the same content.
The company also tried “repeat offender demotion” to reduce the reach of posts from users who frequently violate the guidelines. In a test in two of the world’s most volatile countries, the demotion worked well in Ethiopia but poorly in Myanmar, a discrepancy that puzzled engineers, according to a 2020 report included in the documents.
“We’re not sure why, but this information provides a starting point for further analysis and user research,” the report said. Facebook declined to comment on the record about whether the problem was fixed a year after it was discovered, or about how successful the two tools were in Myanmar.