How Facebook Is Complicit in Myanmar’s Attacks on Minorities

Facebook faces accusations of favoring a majoritarian regime in Myanmar.
On August 14, the Wall Street Journal published a damning report revealing how Facebook India’s head of public policy, Ankhi Das, “opposed applying hate speech rules” to at least four figures from the ruling Bharatiya Janata Party (BJP) who had posted violent, Islamophobic content on their profiles.

According to the report, Das did so to remain in the ruling party’s good books and protect the social media giant’s business prospects in India. This is not the first time Facebook has been accused of looking away from majoritarian hatred that members of a ruling regime have peddled on its platform.

In fact, the U.S.-based company, headed by Mark Zuckerberg, is currently mired in a similar debacle in India’s eastern neighbor, Myanmar — a country that today stands accused at the International Court of Justice (ICJ) of perpetrating genocide against the Rohingya Muslim community.

Non-Cooperation in Genocide Trial

In June, The Gambia — a small West African country that brought unprecedented charges against Myanmar at the ICJ — filed a request with a U.S. federal court seeking critical information from Facebook that could help hold Myanmar accountable for its alleged misconduct against the Rohingya during “clearance operations” in 2016 and 2017.

It specifically asked for “documents and communications from Myanmar military officials,” as well as information from several other pages and accounts that Facebook had taken down (and preserved) in 2018, after a U.N. fact-finding mission indicted the social media company for playing a “determining” role in the genocide against Rohingya Muslims.

If The Gambia’s legal team at the ICJ is able to procure these critical documents, they could prove beyond reasonable doubt that Myanmar security forces used targeted violence against the Rohingya Muslims with specific “genocidal intent.” This is a necessary precondition to show that Myanmar violated the 1948 Genocide Convention, to which it is a state party.

Facebook, however, rejected the request earlier this month. It told the U.S. court that The Gambia’s request was “extraordinarily broad” and would mean providing “special and unbounded access” to accounts.

Yet, according to a recent report in Time, The Gambia’s request was anything but “extraordinarily broad,” since it specifically named “17 officials, two military units and dozens of pages and accounts.”

Facebook also argued that The Gambia’s request is “in violation of U.S. law, specifically a section of the Stored Communications Act (SCA).” The Time report posited that this argument is invalid because the SCA doesn’t protect “unlawful actions of state actors.”

Double Standards

All of this comes two years after Facebook admitted in an official statement that it hadn’t done enough to prevent its platform “from being used to foment division and incite offline violence” in Myanmar. It further said: “We know we need to do more to ensure we are a force for good in Myanmar, and in other countries facing their own crises.”

Facebook also categorically stated that its internally commissioned Human Rights Impact Assessment recommended that it “preserve and share data where it can be used to evaluate international human rights violations, and that the company publish data specific to Myanmar so that the local and international community can evaluate progress more effectively.”

But today, Facebook seems to be going back on its promises. By not cooperating with The Gambia’s legal team and creating a critical roadblock in the ongoing ICJ trial, Facebook is certainly not doing more to ensure that it is a “force for good in Myanmar.” It is also failing to aid an important international effort to establish accountability in the country.

This is worrying because Facebook is an overwhelming force in Myanmar. As observed by the U.N. International Fact-Finding Mission (IFFM) on Myanmar in its September 2018 report, “for most users [in Myanmar], Facebook is the internet.” The mission also noted that “Facebook has been used to spread hate” in the country and regretted that the company was unable to provide country-specific information about hate speech on its platform.

Before that, in April 2018, Zuckerberg admitted during a grueling U.S. congressional hearing that there was a problem of hate speech in Myanmar, and that Facebook was hiring “dozens” of Burmese-language content reviewers to better regulate hateful content in the language.

But four months later, a Reuters analysis found that hate speech was still flourishing on Facebook in Myanmar. One of the many such posts, with reference to the Rohingya, said: “We must fight them the way Hitler did the Jews, damn kalars!”

“Kalars” is a common racial expletive used against the Rohingya (among other groups) in Myanmar.

Also in April 2018, digital researcher and analyst Raymond Serrato told The Guardian that “Facebook definitely helped certain elements of society to determine the narrative of the conflict in Myanmar.”

Serrato had profiled close to 15,000 Facebook posts by members of the Buddhist ultranationalist group, Ma Ba Tha, and found that anti-Rohingya hate speech went up by 200 percent after the August 2017 attacks by Rohingya insurgents on border police outposts in northern Rakhine State and the military’s violent and possibly genocidal “clearance operations” that followed.

Pro-Regime Bias

While no Ankhi Das-like smoking gun has yet emerged in Myanmar to show overt pro-regime bias on Facebook’s part, there have been several indications that the social media company has favored majoritarian state and quasi-state actors over minority groups.

For instance, in September 2017, Facebook banned the Arakan Rohingya Salvation Army (ARSA) — a Rohingya insurgent outfit behind the August 25, 2017 coordinated attacks on border police outposts in northern Rakhine — designating it as a “dangerous organization.” It also sanctioned the removal of all posts that even praised ARSA. Less than a month earlier, on the day of the attacks, the Myanmar government had declared ARSA a “terrorist organization.”

But Facebook did not impose any such strikes on accounts operated by or linked to the Myanmar military until August 2018. This is despite the widely reported havoc caused by the military’s violent scorched-earth campaigns in Rohingya villages, which triggered a massive humanitarian crisis by forcing 800,000 Rohingya to flee to Bangladesh.

Then in February 2019, Facebook designated four other Ethnic Armed Organizations (EAOs), including the Rakhine-based Arakan Army (AA), as “dangerous organizations,” restricting “all related praise, support and representation” for the minority rebel groups. Two months after Facebook’s ban, the Myanmar government formally classified the AA as a “terrorist group and unlawful organization.”

Interestingly, Twitter continues to host the official accounts of both the AA and ARSA, and both groups continue to use the platform to release statements.

Activists in Myanmar flagged Facebook’s ban as pro-regime, anti-minority conduct and a violation of freedom of speech. One Yangon-based analyst, Ko Maw Htun Aung, told The Irrawaddy that Facebook had become “frenetic” and was “overreacting to their past failure,” by which he meant the Rohingya crisis. One human rights observer told The Guardian that Facebook was “tipping the scales” toward the military and providing “a big boost for the government.”

In the Myanmar context, the scales are already tipped in the military’s favor when it comes to the civil conflict. The Myanmar military, or Tatmadaw, which ruled the country with an iron fist for 60 years, is a majoritarian institution dominated by the Bamar ethnic group. It has a long history of violent conduct against ethnic minorities, including the Kachins, Karens and Shans. Despite the coming of a democratically elected government led by Aung San Suu Kyi in 2015, the Tatmadaw continues to maintain a firm grip over Myanmar’s national affairs.

While the Suu Kyi government began a peace process in 2015, the military has been locked in a never-ending attritional war with the four minority rebel groups that Facebook banned in February. The battle-hardened and heavily armed Tatmadaw’s offensives in rebel-held minority areas have caused large-scale destruction and displacement of civilians.

In fact, the 2018 U.N. fact-finding report had noted that “actions of the Tatmadaw in both Kachin and Shan States since 2011 amount to war crimes and crimes against humanity.” Despite this, Facebook is yet to impose any across-the-board ban, of the kind slapped on the four minority rebel groups, on military-run accounts. Instead, it seems to be taking a deeply statist approach toward these groups, thus helping an army that stands accused of genocide.

Business Over Justice

But why would Facebook favor the regime in Myanmar?

For the same reason it would do so in India: to protect business interests in a domestic market that it currently dominates by a wide margin. Imposing bans on government- or military-linked accounts could dilute this monopoly by drawing the ire of state regulators. Such a policy also risks angering the general public, especially the majority constituencies. Not the best PR strategy for a profit-maximizing entity.

This is disappointing, but not surprising. Most corporate multinationals behave in the exact same way when it comes to preserving their space in valuable national markets. To do so, a company needs to have a stable working relationship with the powers that be. In fact, even the U.N. in Myanmar was once accused of looking the other way when the Rohingya came under attack in 2012, all to maintain the goodwill of the government in power.

The Ankhi Das expose and the Myanmar story indicate a worrying new reality. When social media giants preoccupy themselves with their annual returns instead of focusing on building safe online spaces, peace, justice, and accountability suffer. In fact, given the pervasiveness of social media today, such behavior by powerful companies like Facebook categorically empowers authoritarian regimes backed by majorities and, consequently, puts vulnerable minority groups in the direct line of fire.

Without meaningful policy changes and a sincere commitment to values of justice, co-existence and accountability, social media platforms are bound to do more harm than good in today’s hate-filled online environs. Currently, the largest social media company in the world appears complicit in speeding up this avoidable trajectory.

Angshuman Choudhury is a senior researcher and coordinator of the Southeast Asia Research Programme at the Institute of Peace and Conflict Studies, New Delhi, and former GIBSA visiting fellow to the German Institute for International and Security Affairs in Berlin.