Facebook Objects to Releasing Private Posts About Myanmar’s Rohingya Campaign

Facebook was used to spread disinformation about the Rohingya, the Muslim ethnic minority in Myanmar, and in 2018 the company began to delete posts, accounts and other content it determined were part of a campaign to incite violence. 

That deleted but stored data is at issue in a case in the United States over whether Facebook should release the information as part of a claim in international court. 

Facebook this week objected to part of a U.S. magistrate judge’s order that could have an impact on how much data internet companies must turn over to investigators examining the role social media played in a variety of international incidents, from the 2017 Rohingya genocide in Myanmar to the 2021 Capitol riot in Washington. 

The judge ruled last month that Facebook had to give information about these deleted accounts to Gambia, the West African nation, which is pursuing a case in the International Court of Justice against Myanmar, seeking to hold the Asian nation responsible for the crime of genocide against the Rohingya.

But in its filing Wednesday, Facebook said the judge’s order “creates grave human rights concerns of its own, leaving internet users’ private content unprotected and thereby susceptible to disclosure — at a provider’s whim — to private litigants, foreign governments, law enforcement, or anyone else.” 

The company said it was not challenging the order when it comes to public information from the accounts, groups and pages it has preserved. It objects to providing “non-public information.” If the order is allowed to stand, it would “impair critical privacy and freedom of expression rights for internet users — not just Facebook users — worldwide, including Americans,” the company said. 

Facebook has argued that providing the deleted posts would violate U.S. privacy law, citing the Stored Communications Act, the 35-year-old statute that established privacy protections for electronic communications. 

Deleted content protected? 

In his September decision, U.S. Magistrate Judge Zia M. Faruqui said that once content is deleted from an online service, it is no longer protected.

Paul Reichler, a lawyer for Gambia, told VOA that Facebook’s concern about privacy is misplaced. 

“Would Hitler have privacy rights that should be protected?” Reichler said in an interview with VOA. “The generals in Myanmar ordered the destruction of a race of people. Should Facebook’s business interests in holding itself out as protecting the privacy rights of these Hitlers prevail over the pursuit of justice?” 

But Orin Kerr, a law professor at the University of California at Berkeley, said on Twitter that the judge’s ruling erred and that the implication of the ruling is that “if a provider moderates contents, all private messages and emails deleted can be freely disclosed and are no longer private.”

The 2017 military crackdown on the Rohingya resulted in more than 700,000 people fleeing their homes to escape mass killings and rapes, a crisis that the United States has called “ethnic cleansing.”

‘Coordinated inauthentic behavior’ 

Human rights advocates say Facebook had been used for years by Myanmar officials to set the stage for the crimes against the Rohingya. 

Frances Haugen, the former Facebook employee who testified about the company in Congress last week, said Facebook’s focus on keeping users engaged on its site contributed to “literally fanning ethnic violence” in countries. 

In 2018, Facebook deleted and banned accounts of key individuals, including the commander in chief of Myanmar’s armed forces and the military’s television network, as well as 438 pages, 17 groups and 160 Facebook and Instagram accounts — what the company called “coordinated inauthentic behavior.” The company estimated 12 million people in Myanmar, a nation of 54 million, followed these accounts. 

Facebook commissioned an independent human rights study of its role that concluded that prior to 2018, it indeed failed to prevent its service “from being used to foment division and incite offline violence.” 

Facebook kept the data on what it deleted for its own forensic analysis, the company told the court. 

The case comes at a time when law enforcement and governments worldwide increasingly seek information from technology companies about the vast amount of data they collect on users. 

Companies have long cited privacy concerns to protect themselves, said Ari Waldman, a professor of law and computer science at Northeastern University. What’s new is the vast quantity of data that companies now collect, a treasure trove for investigators, law enforcement and government. 

“Private companies have untold amounts of data based on the commodification of what we do,” Waldman said.

Privacy rights should always be balanced with other laws and concerns, such as the pursuit of justice, he added.

Facebook working with the IIMM 

In August 2020, Facebook confirmed that it was working with the Independent Investigative Mechanism for Myanmar (IIMM), a United Nations-backed group that is investigating Myanmar. The U.N. Human Rights Council established the IIMM, or “Myanmar Mechanism,” in September 2018 to collect evidence of the country’s most serious international crimes.

Recently, IIMM told VOA it has been meeting regularly with Facebook employees to gain access to information on the social media network related to its ongoing investigations in the country. 

A spokesperson for IIMM told VOA’s Burmese Service that Facebook “has agreed to voluntarily provide some, but not all, of the material the Mechanism has requested.” 

IIMM head Nicholas Koumjian wrote to VOA that the group is seeking material from Facebook “that we believe is relevant to proving criminal responsibility for serious international crimes committed in Myanmar that fall within our mandate.”  

Facebook told VOA in an email it is cooperating with the U.N. Myanmar investigators. 

“We’ve committed to disclose relevant information to authorities, and over the past year we’ve made voluntary, lawful disclosures to the IIMM and will continue to do so as the case against Myanmar proceeds,” the spokesperson wrote. The company has made what it calls “12 lawful data disclosures” to the IIMM but didn’t provide details. 

Human rights activists are frustrated that Facebook is not doing more to crack down on bad actors who are spreading hate and disinformation on the site.

“Look, I think there are many people at Facebook who want to do the right thing here, and they are working pretty hard,” said Phil Robertson, who covers Asia for Human Rights Watch. “But the reality is, they still need to escalate their efforts. I think that Facebook is more aware of the problems, but it’s also in part because so many people are telling them that they need to do better.” 

Matthew Smith of the human rights organization Fortify Rights, which closely tracked the ethnic cleansing campaign in Myanmar, said the company’s business success indicates it could do a better job of identifying harmful content. 

“Given the company’s own business model of having this massive capacity to deal with massive amounts of data in a coherent and productive way, it stands to reason that the company would absolutely be able to understand and sift through the data points that could be actionable,” Smith said. 

Gambia has until later this month to respond to Facebook’s objections.