In a report published on May 26, Facebook presented a comprehensive look at how the platform has been used for covert influence operations. Covering the period from 2017 to the middle of this year, Facebook's "The State of Influence Operations 2017-2020" report identifies the top five sources of global "coordinated inauthentic behavior" (CIB) campaigns, drawing on more than 150 CIB operations that Facebook has identified, reported, and disrupted.
The social media behemoth defines CIB as a subset of influence operations in which state or non-state actors use coordinated networks of Facebook pages, accounts, and groups to deceive both the platform and its users about their identity and intent, relying crucially on fake accounts.
In terms of the geographical sources of CIB networks, Russia ranks first in the new report, with Facebook having identified 27 CIB networks emanating from that country; Iran (23 networks) ranks second and Myanmar third (nine networks). The United States ranks fourth and Ukraine fifth, with nine and eight identified CIB networks, respectively. The report identifies the United States, Russia, and Iran as the top three sources of CIB networks that targeted the U.S. in the run-up to the presidential election last year.
The United States, Ukraine, and the United Kingdom were the top three targets of foreign influence operators, measured by Facebook through the number of CIBs it removed that were targeting these countries.
Speaking to The Diplomat, Facebook's head of security policy Nathaniel Gleicher emphasized that Facebook identifies CIB networks by their behavior, in particular their critical reliance on fake accounts, rather than by the content propagated through these networks. This approach, the new report claims, enables the platform to clearly distinguish CIB from other types of influence operations, such as the strategic amplification of authentic accounts among a target audience.
The report observes that 45 percent of the CIB networks it identified and disrupted between 2017 and May 2021 exclusively targeted a domestic audience, while 38 percent solely targeted audiences outside the countries they operated from. The remaining 17 percent focused on both domestic and foreign audiences.
It notes that “[d]espite the fact that public discourse in the U.S. shifted from focusing on foreign operations in 2017-2019 to focusing on domestic operations in 2020, we continued to see significant portions of all three types, and a steady rise in mixed targeting from 2018 through 2020.”
With the increasing use of automated detection methods that identify and block fake accounts without human intervention, actors have shifted their strategy from a "wholesale" approach to a "retail" one that targets smaller, more specific audiences, the Facebook report notes. This marks a noticeable shift from 2017, when the tech giant issued its first influence operations white paper. The retail approach is costly for operators in terms of the resources it consumes, since it involves creating multiple accounts across platforms to provide "cover" for fake personas. The report claims that this retail approach to influence operations has, so far, met with extremely limited success.
Another trend the new Facebook report identifies is the rise of "influence operators for hire": commercial firms, including media, marketing, and public relations companies, that target audiences both at home and abroad at the behest of a given actor. While the use of commercial actors makes attribution particularly difficult, it also carries significant limitations: many such firms lack the specific domestic background needed to appear convincing to the target audience, especially when they operate across multiple geographies.
A third trend that Facebook identifies in its new report is the use of multiple platforms by influence operators, both experienced and novice, in order to survive detection and removal by any one of them. It provides the example of aRep Global, an Indian digital marketing company that it removed from its platform in February last year. The firm also operated on Twitter, Medium, Instagram, and other social media sites and "focused on a wide range of topics: from politics in the Gulf region to the 2022 FIFA World Cup in Qatar," the Facebook report notes.
Gleicher noted that countering influence operations is a "team sport." "Today, there are expert defenders from a range of stakeholders/actors – including industry and civil society – working together to stop these threats, and we believe a coordinated approach across the whole of society remains critical," he added.