A full-blown genocide unfolds as members of the majority race mercilessly kill, maim, and rape those from the oppressed minority. Politicians of the ruling party openly denigrate minorities, likening them to diseases and vermin and encouraging the populace to mobilize and cut them down. They are spurred on by conspiracy theories and fake news, spread like wildfire through various channels by people in positions of power. By the time the dust settles, 800,000 people are dead and 2 million more are refugees in neighboring countries.
The year is 1994 and the country is Rwanda – but the description could just as easily be of Myanmar in 2016 and 2017. In both genocides, the ability to propagate false narratives that fan the flames of racial hatred (in the case of Myanmar, both racial and religious hatred) was a powerful tool for those with genocidal motives. The only difference is that while the preferred media in 1994 were radio stations and newspapers, today hatred can reach a wider audience instantaneously, thanks to the power of social media and smartphones.
Human rights advocates have long warned of an orchestrated campaign of hate against the Rohingya, now amplified on social media. The typical rhetoric calls on the nation to “protect race and religion” against the “Bengalis” – refusing to even acknowledge the term “Rohingya,” thus reaffirming the narrative that these people are “illegal immigrants” from Bangladesh.
In the aftermath of the military campaign against the Rohingya in northern Rakhine state in August and September 2017, Senior General Min Aung Hlaing, commander-in-chief of Myanmar’s armed forces, posted on his Facebook page:
The Bengali population exploded, and the aliens tried to seize the land of local ethnics… Race cannot be swallowed by the ground but only by another race. All must be loyal to the State in serving their duties, so that such cases will never happen again.
Together, the military and their supporters have shaped a narrative in which the armed forces are heroes protecting against an existential threat posed by the Rohingya Muslims to Myanmar and the Buddhist way of life.
It is telling, then, that in the immediate wake of the recent report of the UN Fact-Finding Mission (FFM) on Myanmar accusing the military of genocide, crimes against humanity, and war crimes, Facebook removed a total of 18 user accounts, one Instagram account, and 52 pages followed by nearly 12 million people – including the account of the commander-in-chief of the armed forces and the military’s Myawady television network.
While only time will tell the overall impact of Facebook’s actions, what is certain is that the social media giant is finally coming to terms with its responsibility in Myanmar, where it exerts an outsize influence on news and information. Just as importantly, the move underscores Facebook’s past failure to react quickly enough to such vitriol spreading online.
The Uncomfortable Digital Reality
Concerns about the ineptitude of Facebook (which owns WhatsApp and Instagram) and Twitter in dealing with misinformation have come to the forefront in the wake of the 2016 presidential election in the United States. While Western countries with strong democratic institutions such as Germany have grappled with false narratives about immigrants raping and attacking citizens, the situation is undeniably worse in countries with a weak rule of law and poor digital literacy. For instance, since April 2018, angry vigilante mobs in India – infuriated by rumors and fake news of child kidnapping spread via WhatsApp – have killed more than two dozen innocent people.
As a result, more countries have begun looking at ways to hold social media giants responsible for the content produced and shared on their platforms. In Germany, the controversial and much-criticized Network Enforcement Act, popularly known as NetzDG, requires network service providers to take down or block “unlawful content” within 24 hours of receiving a report in manifestly unlawful cases, or within seven days for less clear-cut content.
Expecting similar legislation in Myanmar would be unrealistic. In a country where people in power openly compare the Rohingya to crows, dogs, and insects; call them “kalar” (dark-skinned), terrorists, or terrorist sympathizers; accuse them of wanting to dominate the country by having too many children; and even tell them to get ready to be killed, there is little political will for legislation designed to protect the Rohingya and other groups from dangerous speech.
Perhaps more chillingly, speeches like these are eerily reminiscent of the hate propaganda that preceded the Rwandan genocide, when President Grégoire Kayibanda declared: “Our party is concerned with the interest of the Hutu who have been dominated and scorned by the Tutsi who invaded the country. We have to be the light of the mass, we have to capture back the country and return it to the true owners. The country belongs to the Hutu.” Likewise, Radio Télévision Libre des Mille Collines and the Kangura newspaper mocked the Tutsis as cockroaches, and radio presenters celebrated the deaths of Tutsis on air.
There are murkier questions on freedom of speech and information in a country like Myanmar. Long deprived of open communication under military rule, millions today have access to mobile phones after the government liberalized the telecommunications sector. A SIM card now sells for around $1.50, down from roughly $2,500 just over a decade ago. With the subsequent explosion of new mobile phone users, the state’s previously tight censorship of the digital sphere has effectively been lifted. This has brought greater freedom, but it also comes at a cost: the ability to spread hate and misinformation.
It is for these reasons that Facebook’s volte-face (albeit too little, too late for those who have died in the conflict) is welcome. Indeed, more must be done to protect vulnerable minorities in cyberspace. The fact that the FFM report explicitly singled out Facebook’s previous inaction in a country “where for most users Facebook is the internet” reflects the growing consensus that social media companies can no longer afford to sit idly by and profess neutrality.
Facebook is beginning to hire more Burmese-speaking staff to moderate content. The company currently has 60 employees sifting through content from Myanmar, nowhere near enough to cover a user base estimated at 15 to 20 million monthly active users.
What’s Next for Social Media?
That Facebook is finally waking up to its power, and the responsibilities that come with it, is a welcome sign. But it is sadly an isolated case: as far as we can tell, Twitter has not acted at all thus far, and there are surely other platforms on which such hateful content thrives (in fact, activity on Senior General Min Aung Hlaing’s Twitter account spiked to three times its daily average after the Facebook ban).
Facebook is not guilty of churning out overtly murderous content in the way the Rwandan radio stations and newspapers were, but it certainly played a pivotal role in enabling those with such aims to reach a massive audience.
It is certainly not the job of social media platforms to decide what is and is not within the boundaries of free speech. But in extreme situations like Myanmar, where violence is threatened or purposefully incendiary content is posted, the time for inaction is long over. Platforms must be mindful moderators: respond faster, do better, lest there be blood on their hands.
More than two decades ago, those spreading hateful propaganda in Rwanda would have salivated at the thought of social media as we know it today. We are painfully aware of the textbook lessons of the Rwandan genocide – and yet, governments and nonstate actors like Facebook seem inclined to inaction in spite of all we know. The Rohingya continue to suffer because of this, and the question remains: how many more will have to die before other companies like Facebook wake up and realize that failing to act makes them complicit? Only time will tell – and what a horrific tale it will be.
Eric Paulsen is Legal Director of Fortify Rights.