Facebook has apologized for its role in Sri Lanka’s 2018 anti-Muslim riots after an investigation found that incendiary content may have led to the deadly violence.
Facebook’s failure to remove hate speech and disinformation during the 2018 riots helped spur the deadly violence that erupted in Sri Lanka. The investigation was carried out by Article One, which published a report as part of a two-year partnership with Facebook to conduct human rights assessments in Sri Lanka.
Although everyone has the right to free speech, that right should not serve as cover for inciting hatred against others, nor as a catalyst for planning attacks against minorities.
The social media giant issued a statement to Bloomberg “deploring” the misuse of its platform, stating: “We recognize, and apologize for, the very real human rights impacts that resulted.” The damage, however, has already been done: relations between communities remain strained, and the Islamophobia already prevalent in the country has been further fueled.
As a Sri Lankan Muslim and a journalist who covered the riots, I believe that social media platforms like Facebook need to make fundamental changes to protect minority groups from content that could incite violence against them and further fuel divisions and pre-existing Islamophobia.
The riots in early 2018 started in the district of Ampara in the Eastern Province, where anti-Muslim hatred was whipped up via posts across social media platforms such as Facebook. Facebook, more than any other platform, was used by Buddhist nationalists to spread propaganda against Sri Lanka’s Muslim minority, which makes up just 10 percent of the country’s population.
A report by Sanjana Hattotuwa, the editor of Groundviews, outlined how content-generation trends were studied across 465 Facebook accounts, finding content that framed Sinhalese Buddhists as “under threat” from Islam and Muslims, and “consequently in need of urgent and if necessary violent pushback.”
The Facebook pages revealed an increase in anti-Muslim content just before the violence erupted in March 2018. At least three people were killed and 20 injured in the 2018 unrest. Mosques and Muslim businesses were burnt by majority Sinhalese Buddhist nationalists.
A nationwide block on Facebook, WhatsApp and Twitter was issued by the government following the mob violence against Sri Lankan Muslims, but it was lifted after just 72 hours. The riots continued, with many victims saying they felt the police were enabling the attacks rather than acting against the perpetrators. Despite blocking the social media networks in a bid to stop the further planning of attacks against Muslims, government officials and those in power in Sri Lanka did not speak out on social media to quell the violence. This lack of condemnation from those in power emboldened the mobs to incite further violence.
The 2018 Sri Lankan riots are a testament to the perils of insufficiently monitored social media, which puts the most vulnerable in society at risk. Islamophobia has intensified over the years in Sri Lanka and rose again after the Easter attacks in April 2019, leaving many Muslims vulnerable to being systematically targeted by nationalists.
In addition, amid the unprecedented outbreak of COVID-19, Muslims in Sri Lanka are being targeted by the authorities, their dead cremated against the community’s will, even when the deceased had not tested positive for COVID-19.
It is for this reason that it is all the more imperative that Facebook make a concerted effort to take down posts that spread disinformation, incite hatred and violence, and have the potential to sow discord between communities. Freedom of speech should not be used as cover for inciting hatred toward any community. Yet Facebook has failed to fully reckon with its mistake, and posts continue to go unchecked, leaving room for disinformation and propaganda to spread around the world.
Facebook and other social media networks need to take responsibility for the content they allow on their platforms. The mishandling of misinformation and fake news can have disastrous consequences, which is why the platforms need to adopt systems that can verify sources, photos and information. All hate speech must be taken off the platform with immediate effect, and investment is needed in the moderation of posts that could have life-threatening consequences if seen by the wrong eyes.
Now that Facebook has apologized for its role in the 2018 Sri Lanka riots, it must be held accountable for taking further action: investing in tackling fake news, disinformation and misinformation, and ensuring that such violence never erupts again.
Tasnim Nazeer is an award-winning journalist and author who has written for a variety of print and online publications including The Guardian, Al Jazeera English, CNN, HuffPost UK and many more.