Meta, the owner of Facebook, says it will undertake additional efforts to combat misinformation, hate speech, and online manipulation on the platform during Malaysia’s upcoming general election.
In a media briefing yesterday, Roy Tan, the head of politics and government outreach for Meta’s Asia-Pacific operations, said that the company would attempt to make it harder to interfere with elections on Facebook and its other platforms like WhatsApp in the run-up to the November 19 election.
“We are committed to tackling harmful content on our platform, which we know may intensify during elections, from misinformation, misrepresentation and interference, to hate speech,” Tan said, according to a report by TheVibes.com. “We have dedicated teams working on every upcoming election around the world, as it is incredibly important to us as a company that we get this right ahead of Malaysia’s 15th general election.”
Specifically, Tan said that Meta will undertake efforts to fight misinformation, remove content on Facebook and Instagram that discourages or interferes with voting, and reduce the reach and distribution of posts deemed false or misleading. It also plans to support digital literacy in Malaysia and to increase the transparency of political advertising purchased on Facebook.
Well, it might. Meta has attracted fierce criticism for the inadequacy of its past responses to the use of its platform for large-scale political misinformation campaigns and the propagation of hate speech. In May, Philippine President Ferdinand Marcos Jr. rode to power on a massive campaign of disinformation aimed at whitewashing the repression and corruption of his father’s rule in the 1970s and 1980s. The same was true of his predecessor Rodrigo Duterte, who unleashed a “troll army” on social media to drown out and shout down his opponents.
An extreme example is Myanmar, where Facebook played a well-recognized (even by Meta) role in the military’s campaign of forced displacement and alleged genocide against the Rohingya communities of the country’s west in 2016 and 2017. According to a New York Times investigation into Facebook’s role in the anti-Rohingya campaign, Myanmar military personnel had “turned the social network into a tool for ethnic cleansing.”
In a recent report, the international rights group Amnesty International claimed that this was no crime of omission: Meta’s “dangerous algorithms and reckless pursuit of profit… substantially contributed to the atrocities perpetrated by the Myanmar military against the Rohingya people in 2017,” the report concluded. As a result, it recommended that Meta pay reparations “to all those who suffered the violent consequences of their reckless actions.”
On the one hand, it is encouraging that Meta is taking steps to ensure that its platforms are not abused by political actors, beyond its usual detection of “coordinated inauthentic behavior” on platforms like Facebook. As with its 2018 mea culpa for its role in the violence against the Rohingya in Myanmar, the announcement ahead of the Malaysian election is a sign that Facebook recognizes its responsibility to ensure that it does not become an unwitting participant in campaigns that aim to disseminate disinformation and incite offline violence.
On the other hand, there are serious doubts as to whether Facebook’s business model, which is based on ginning up as much traffic as possible in order to generate advertising revenue, can be reconciled with a genuine attempt to cleanse its networks of misinformation and falsehood. As the former Cambodian parliamentarian Mu Sochua wrote in these pages prior to the Philippine election, “too often what grabs our attention is sensationalism and outrage rather than fact-based information, thus allowing disinformation, extremism, and division to be massively amplified.”
The Malaysian election will provide an opportunity to assess whether Meta can manage to reconcile these contradictions. Given its past performance and present gargantuan profits, there is not much reason to accord it the benefit of the doubt.