Facebook and the Rohingya Genocide
The Rohingya are an ethno-religious group that has lived in Myanmar’s Rakhine State for centuries. Most Rohingya are Muslim, though a Hindu minority exists among them. Both religious identities are vastly outnumbered by Myanmar’s 88% Buddhist majority. Despite their long residence in the region, the Rohingya are not among the eight major ethnic groups recognized by the government. Instead, the Burmese government has systematically worked to strip the Rohingya of citizenship, characterizing them as ethnic and religious outsiders and referring to them chiefly as “Bengalis.”
Stringent restrictions on mobility, employment, and eventually voting rights left the now-stateless Rohingya completely disenfranchised over a period of decades, earning them the label of “the most persecuted people in the world.” Amnesty International and Desmond Tutu have described the Burmese treatment of the Rohingya as apartheid.
In 2016, men armed with knives and sharpened sticks attacked police outposts along the Myanmar–Bangladesh border, killing a handful of officers. The Arakan Rohingya Salvation Army (ARSA) claimed responsibility for the attacks as a protest against the harms suffered by the Rohingya. The Burmese military retaliated with a full-scale pogrom against the Rohingya that continues to the present day.
The military instituted a reign of terror, using murder, rape, and torture against this already battered people. At least 392 villages were partially or wholly destroyed, and the figure of 10,000 Rohingya deaths is considered a conservative estimate of the bloodshed. According to the UN’s count, 723,000 Rohingya have fled to neighboring countries.
Recent UN reports found evidence of a concerted, premeditated effort on the part of Burmese generals to engage in ethnic cleansing. Aung San Suu Kyi, leader of Myanmar and winner of the 1991 Nobel Peace Prize, who herself once suffered at the hands of the Burmese military, has long ignored or denied atrocities against the Rohingya, eliciting international censure. Buddhist monks like Ashin Wirathu, though idealized in the Western imagination as the Platonic realization of the pacifist, play a significant role in advocating violence against Muslims in the name of Buddhist nationalism.
In this systematic decimation of the Rohingya, Burmese authorities found help from a surprising quarter: Facebook. The UN ascribed to Facebook a fundamental role in the dissemination of hate and disinformation. For most people in Myanmar, Facebook is the only source of information, so it was easy for military generals to deploy the platform as a covert propaganda tool. Their efforts reached 12 million users, nearly a quarter of the national population of 51 million. Recently, in response to intense international scrutiny, Facebook finally announced that it was removing the accounts of twenty military individuals and organizations, a move that provoked a greater outcry among the Burmese than the Rohingya genocide itself.
Facebook hate speech throughout the Burmese ethnic cleansing was not just a concerted military operation; it flourished among political parties in Myanmar. An analysis by BuzzFeed News found that, of four thousand posts by Burmese politicians, one in ten contained hate speech that violated Facebook’s community standards. Examples included “othering” comments comparing Rohingya to animals, misogynistic statements declaring Muslim women “too ugly to rape,” claims that the Rohingya faked their tragedies and that Muslims were seeking to out-populate Buddhists, and direct threats of bloodshed. After months of inaction, Facebook finally began to take some of these posts down only when confronted by a BuzzFeed representative.
It is surprising that, in the words of writer Casey Newton, it took “a genocide to remove someone from Facebook.” It is slightly less surprising in light of Facebook’s policies and track record in dealing with hate speech on scales less than genocide. Numerous shared user experiences form a picture of Facebook’s extraordinarily crude application of its officially “neutral” policy. Even within the less extreme North American context, women are regularly suspended by Facebook administrators for calling out men who threaten them with rape and violence, while their harassers suffer no consequences. Meanwhile, black children are not a protected group, although white men are. Danielle Citron, a law professor at the University of Maryland and an expert on information privacy, notes that Facebook’s context-blind algorithms purporting to curb hate speech ultimately serve to “protect the people who least need it and take it away from those who really need it.”
Facebook possesses the resources to hire experts on best practices for regulating hate speech and propaganda, even in highly volatile contexts. And yet the platform falls wide of the mark in confronting hate speech, harassment, and disinformation even in stable democracies. What is holding it back?
Facebook’s culpable vulnerability to becoming a propaganda machine and fuel for unsavory regimes will continue unless civil society devises clear norms to demand of it and other social media platforms. We must work to translate social, scientific, and political knowledge about how hate and violence are generated in local contexts. We must also establish minimum standards for internal oversight of social media so that plausible deniability on the part of corporations is no longer an option. Facebook is a reminder that corporations are guided not by the advancement of humankind but by markets and users. Indifferent to outcomes, their platforms can nurture community building, the spread of knowledge, and skill-building, or they can foster intense group identification, disinformation, hatred, and government propaganda. As Facebook is currently the global giant of social media, synonymous with the Internet itself in Myanmar, it is up to us as members of the international community to hold it accountable, along with the other players in this tragedy.